Linked-List
Question 1 |
(The answer options were given as images and are not reproduced here.)
option A and option B |
Method-1:
When elements are inserted into the (initially empty) linked list in such a way that the list is in sorted order after every insertion, the whole process takes O(n^2) time.
Explanation:
To keep the list sorted after every insertion, we must first find the correct position of the new element, which requires walking along the list. In the worst case this walk visits all the elements already present, so a single insertion costs O(n) time (the pointer update itself is only O(1)).
Inserting n elements one after another therefore costs O(1) + O(2) + … + O(n) = O(n^2) time in the worst case. A linked list offers no random access, so the insertion position cannot be located by binary search; the linear scan per insertion cannot be avoided. A sketch of such an insertion is given below.
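The following is a minimal sketch of this sorted insertion, assuming a simple node type; the helper name sorted_insert is illustrative and not part of the question. The while loop that locates the position is what makes each insertion O(n) in the worst case.

#include <stdlib.h>

struct node { int data; struct node *next; };

/* Insert value into an already-sorted list and return the new head.
   Locating the position may scan every existing node: O(n) per insertion,
   hence O(n^2) for n insertions into an initially empty list. */
struct node *sorted_insert(struct node *head, int value) {
    struct node *n = malloc(sizeof *n);
    n->data = value;
    if (head == NULL || value <= head->data) {  /* new smallest element */
        n->next = head;
        return n;
    }
    struct node *p = head;
    while (p->next != NULL && p->next->data < value)  /* O(n) position scan */
        p = p->next;
    n->next = p->next;
    p->next = n;
    return head;
}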
Method-2:
When each element is inserted into the empty linked list in O(1) time (for example, at the head) and the list is sorted once after all elements have been inserted, the whole process takes O(n log n) time.
Explanation:
A linked-list insertion at the head takes O(1) time, i.e., a constant amount of time per insertion.
The total number of elements inserted into the empty linked list is n, so all the insertions together take O(n) time in the worst case. After inserting the elements, we have to sort the list once.
The sorting method that gives the minimum time complexity here is merge sort, which sorts a linked list in O(n log n) time (a sketch is given below). Let head be the first node of the linked list to be sorted and headRef be the pointer to head; headRef is needed because the implementation rearranges the next links (not the data at the nodes), so the head node has to change whenever the data at the original head is not the smallest value in the list.
Note: Other sorting methods also give decent time complexity, but quicksort degrades to O(n^2) on linked lists and heap sort is not suitable to apply, since it needs random access.
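A minimal sketch of merge sort on a singly linked list, in the spirit of the description above; the node type and the names split, merge, merge_sort and headRef are illustrative, not taken from the question.

#include <stddef.h>

struct node { int data; struct node *next; };

/* Split the list into two halves using the slow/fast pointer technique. */
static void split(struct node *source, struct node **front, struct node **back) {
    struct node *slow = source;
    struct node *fast = source->next;
    while (fast != NULL) {
        fast = fast->next;
        if (fast != NULL) {
            slow = slow->next;
            fast = fast->next;
        }
    }
    *front = source;
    *back = slow->next;
    slow->next = NULL;
}

/* Merge two sorted lists by relinking next pointers (no data is copied). */
static struct node *merge(struct node *a, struct node *b) {
    if (a == NULL) return b;
    if (b == NULL) return a;
    if (a->data <= b->data) {
        a->next = merge(a->next, b);
        return a;
    }
    b->next = merge(a, b->next);
    return b;
}

/* Sort the list in O(n log n); headRef is updated because the node holding
   the smallest value becomes the new head. */
void merge_sort(struct node **headRef) {
    struct node *head = *headRef;
    if (head == NULL || head->next == NULL)
        return;
    struct node *front, *back;
    split(head, &front, &back);
    merge_sort(&front);
    merge_sort(&back);
    *headRef = merge(front, back);
}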
Note: The question says the list “needs to be maintained in sorted order” but does not state clearly whether the order must hold after every insertion or only after all insertions, which is why both O(n^2) and O(n log n), i.e., options A and B, are accepted.
Question 2 |
Consider the C code fragment given below.
typedef struct node {
    int data;
    node* next;
} node;

void join(node* m, node* n) {
    node* p = n;
    while (p->next != NULL) {
        p = p->next;
    }
    p->next = m;
}
Assuming that m and n point to valid NULL-terminated linked lists, invocation of join will
append list m to the end of list n for all inputs. | |
either cause a null pointer dereference or append list m to the end of list n. | |
cause a null pointer dereference for all inputs. | |
append list n to the end of list m for all inputs. |
We have to consider both cases.
⇾ 1. If the lists are not NULL, the invocation of join appends list m to the end of list n. For example:
m = 1, 2, 3
n = 4, 5, 6
After join, list n becomes 4, 5, 6, 1, 2, 3, NULL.
⇾ 2. If list n is empty, i.e., n itself is NULL, then p is NULL and evaluating p->next dereferences a null pointer.
Hence, the invocation may either cause a null pointer dereference or append list m to the end of list n.
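A small self-contained driver (hypothetical, not part of the question) illustrating both behaviours; the struct member is written as struct node *next so that the sketch compiles as plain C.

#include <stdio.h>

typedef struct node { int data; struct node *next; } node;

/* Same logic as the question's join: walk to the end of n, then hook m on. */
void join(node *m, node *n) {
    node *p = n;
    while (p->next != NULL)
        p = p->next;
    p->next = m;
}

int main(void) {
    node a3 = {3, NULL}, a2 = {2, &a3}, a1 = {1, &a2};   /* m = 1,2,3 */
    node b6 = {6, NULL}, b5 = {5, &b6}, b4 = {4, &b5};   /* n = 4,5,6 */

    join(&a1, &b4);                     /* appends m after n */
    for (node *p = &b4; p != NULL; p = p->next)
        printf("%d ", p->data);         /* prints 4 5 6 1 2 3 */
    printf("\n");

    /* join(&a1, NULL); would set p = NULL and crash on p->next. */
    return 0;
}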
Question 3 |
N items are stored in a sorted doubly linked list. For a delete operation, a pointer is provided to the record to be deleted. For a decrease-key operation, a pointer is provided to the record on which the operation is to be performed.
An algorithm performs the following operations on the list in this order: Θ(N) delete, O(log N) insert, O(log N) find, and Θ(N) decrease-key. What is the time complexity of all these operations put together?
O(log² N) | |
O(N) | |
O(N²) | |
Θ(N² log N) |
→ Delete takes O(1) time (a pointer to the record is given, so only the neighbours' links are rewritten).
→ Insert takes O(N) time (the correct position must be found to keep the list sorted).
→ Find takes O(N) time (linear search; a linked list has no random access).
→ Decrease-key takes O(N) time (O(1) to change the key, plus O(N) to move the record to its new sorted position).
The number of times each operation is performed is given, so the total complexity is:
→ Delete: Θ(N) operations × O(1) each = O(N)
→ Insert: O(log N) operations × O(N) each = O(N log N)
→ Find: O(log N) operations × O(N) each = O(N log N)
→ Decrease-key: Θ(N) operations × O(N) each = O(N²)
So, the overall time complexity is O(N²).
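A minimal sketch of the O(1) delete on a doubly linked list, assuming an illustrative node type and function name; no searching is needed because the pointer to the record is supplied.

#include <stdlib.h>

struct dnode { int key; struct dnode *prev, *next; };

/* Delete node x in O(1) given a pointer to it; *headRef is updated
   when x happens to be the first node. */
void delete_node(struct dnode **headRef, struct dnode *x) {
    if (x->prev != NULL)
        x->prev->next = x->next;
    else
        *headRef = x->next;          /* x was the head */
    if (x->next != NULL)
        x->next->prev = x->prev;
    free(x);
}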
Question 4 |
typedef struct node {
    int value;
    struct node *next;
} Node;

Node *move_to_front(Node *head) {
    Node *p, *q;
    if ((head == NULL) || (head->next == NULL))
        return head;
    q = NULL;
    p = head;
    while (p->next != NULL) {
        q = p;
        p = p->next;
    }
    _______________________________
    return head;
}
q = NULL; p->next = head; head = p; | |
q->next = NULL; head = p; p->next = head; | |
head = p; p->next = q; q->next = NULL; | |
q->next = NULL; p->next = head; head = p; |
The function modifies the list by moving the last element to the front of the list.
Let the list be 1 → 2 → 3 → 4 & the modified list must be 4 → 1 → 2 → 3.
Algorithm:
Traverse the list up to the last node, using two pointers: one to store the address of the last node and the other to store the address of the second-last node.
After the loop completes, do the following:
(i) Make the second-last node the last node (set its next to NULL).
(ii) Set the next of the old last node to the head.
(iii) Make the old last node the new head.
while (p->next != NULL)
{
    q = p;
    p = p->next;
}
― p travels to the end of the list while q is set to the node p has just left, so q trails one node behind p through the entire list.
After the loop, p points to the last node and q points to the second-last node.
According to the algorithm, the missing statements must be
q->next = NULL; p->next = head; head = p;
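For reference, a sketch of the function with option D's statements filled into the blank (a hedged completion, assuming the Node typedef and variable names from the question above):

Node *move_to_front(Node *head) {
    Node *p, *q;
    if ((head == NULL) || (head->next == NULL))
        return head;
    q = NULL;
    p = head;
    while (p->next != NULL) {   /* afterwards: p = last node, q = second-last */
        q = p;
        p = p->next;
    }
    q->next = NULL;    /* second-last node becomes the new last node */
    p->next = head;    /* old last node now points to the old head   */
    head = p;          /* old last node becomes the new head         */
    return head;       /* 1->2->3->4 becomes 4->1->2->3              */
}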
Question 5 |
1,2,3,4,5,6,7 | |
2,1,4,3,6,5,7 | |
1,3,2,5,4,7,6 | |
2,3,4,5,6,7,1 |
After 1st Iteration: 2,1,3,4,5,6,7
2nd Iteration: 2,1,4,3,5,6,7
3rd Iteration: 2,1,4,3,6,5,7
After each exchange, the traversal resumes from the still-unprocessed elements because of p = q->next;
Hence 2,1,4,3,6,5,7.
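The original code fragment for this question is not reproduced above. Purely as an illustration, and not as the question's actual code, a loop of the following shape (with hypothetical pointer names p and q) produces exactly the trace shown: it exchanges the data of two adjacent nodes and then advances past the pair it has just processed.

struct node { int data; struct node *next; };

/* Illustrative sketch reconstructed from the described behaviour. */
void pairwise_swap(struct node *head) {
    struct node *p = head;
    struct node *q = (head != NULL) ? head->next : NULL;
    while (p != NULL && q != NULL) {
        int temp = p->data;          /* exchange the data of adjacent nodes */
        p->data = q->data;
        q->data = temp;
        p = q->next;                 /* resume from the unprocessed part    */
        q = (p != NULL) ? p->next : NULL;
    }
}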
Question 6 |
rear node | |
front node | |
not possible with a single pointer | |
node next to front |
Question 7 |
union only | |
intersection, membership | |
membership, cardinality | |
union, intersection |
Let the number of elements in list 1 be n1 and the number of elements in list 2 be n2.
Union:
To take the union of the two lists, each element of one list must be searched for in the other list to avoid duplicates. So the time complexity is O(n1 × n2).
Intersection:
To take the intersection of the two lists, for each element of one list we check whether it is present in the other list. So the time complexity is O(n1 × n2).
Membership:
Here we only check whether a particular element is present in a list, which is a single scan. So the time complexity is O(n1 + n2).
Cardinality:
Here we find the size of a set (list), which requires traversing the list once. So the time complexity is O(n1 + n2).
Hence, union and intersection are the slowest operations.
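A minimal sketch (assuming unsorted singly linked lists and an illustrative node type) of membership and intersection, showing why membership is a single O(n) scan while intersection needs a nested O(n1 × n2) scan; union is analogous to intersection, since each element of one list must be searched in the other to avoid duplicates.

struct node { int data; struct node *next; };

/* Membership: one pass over a single list -> O(n). */
int member(struct node *list, int key) {
    for (struct node *p = list; p != NULL; p = p->next)
        if (p->data == key)
            return 1;
    return 0;
}

/* Intersection: for every element of list1, scan list2 -> O(n1 * n2). */
int intersection_size(struct node *list1, struct node *list2) {
    int count = 0;
    for (struct node *p = list1; p != NULL; p = p->next)
        if (member(list2, p->data))
            count++;
    return count;
}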
Question 8 |
Let P be a singly linked list. Let Q be the pointer to an intermediate node x in the list. What is the worst-case time complexity of the best known algorithm to delete the node x from the list?
O(n) | |
O(log² n) | |
O(log n) | |
O(1) |
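The well-known constant-time approach for this situation is sketched below, with an illustrative node type and function name: because x is an intermediate node, x->next exists, so the successor's data can be copied into x and the successor unlinked instead of x.

#include <stdlib.h>

struct node { int data; struct node *next; };

/* Delete the node pointed to by q in O(1), assuming q is not the last node. */
void delete_intermediate(struct node *q) {
    struct node *succ = q->next;   /* exists because q is an intermediate node */
    q->data = succ->data;          /* overwrite q's data with its successor's  */
    q->next = succ->next;          /* unlink the successor                     */
    free(succ);
}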
Question 9 |
struct item {
    int data;
    struct item *next;
};

int f(struct item *p) {
    return ((p == NULL) || (p->next == NULL) ||
            ((p->data <= p->next->data) && f(p->next)));
}
the list is empty or has exactly one element | |
the elements in the list are sorted in non-decreasing order of data value | |
the elements in the list are sorted in non-increasing order of data value | |
not all elements in the list have the same data value |
Question 10 |
log n | |
n/2 | |
(log2)n - 1 | |
n |
Question 11 |
Singly linked list | |
Doubly linked list | |
Circular doubly linked list | |
Array implementation of list |
Question 12 |
Insertion sort | |
Binary search | |
Radix sort | |
Polynomial manipulation |
Question 13 |
One pointer. | |
Two pointers. | |
Multiple pointers. | |
No pointer. |
p->next = q->next;
q->next = p;
So, two pointer modifications are needed.
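A minimal sketch (assuming a simple node type, with p and q named as in the explanation above) of inserting node p immediately after node q; exactly two pointers are written.

struct node { int data; struct node *next; };

/* Insert node p immediately after node q: exactly two pointer modifications. */
void insert_after(struct node *q, struct node *p) {
    p->next = q->next;   /* 1st modification: p takes over q's successor */
    q->next = p;         /* 2nd modification: q now points to p          */
}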