Question 1
What is the worst case time complexity of inserting n elements into an empty linked list, if the linked list needs to be maintained in sorted order?
 A B C D E option A and option B
Question 1 Explanation:
Method-1:
When we insert each element directly into its correct position, so that the list is sorted after every insertion, inserting n elements takes O(n^2) time in the worst case.
Explanation:
The link update itself takes O(1) time, but finding the correct position for a new element requires scanning the already-sorted list, which costs O(n) in the worst case.
Since a total of n elements are inserted, the worst-case total is n × O(n) = O(n^2), e.g. when the elements arrive in ascending order and every insertion must walk to the end of the list.
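Method-1 can be sketched as a sorted-insertion routine; this is an illustrative sketch (names such as sortedInsert are mine, not from the question):

```c
#include <stdlib.h>

typedef struct Node {
    int data;
    struct Node *next;
} Node;

/* Insert value into an already-sorted list; returns the (possibly new) head.
   Each insert may scan the whole list, so n inserts cost O(n^2) in the
   worst case, e.g. when the input arrives in ascending order. */
Node *sortedInsert(Node *head, int value) {
    Node *n = malloc(sizeof(Node));
    n->data = value;
    if (head == NULL || value < head->data) {
        n->next = head;          /* new smallest element becomes the head */
        return n;
    }
    Node *p = head;
    while (p->next != NULL && p->next->data <= value)
        p = p->next;             /* walk to the insertion point */
    n->next = p->next;
    p->next = n;
    return head;
}
```

Feeding the elements 3, 1, 2 through sortedInsert leaves the list as 1 → 2 → 3 after every call, which is exactly what "maintained in sorted order" demands.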
Method-2:
When we insert each element into the linked list in O(1) time and sort the list once after all elements are inserted, the total is O(n log n) time.
Explanation:
A single linked-list insertion (at the head) takes O(1) time, i.e. a constant amount of time per insertion.
A total of n elements are inserted into the empty list, so all insertions together take O(n) time. After inserting the elements we perform one sorting pass.
Merge sort gives the minimum time complexity for sorting a linked list: O(n log n). Let head be the first node of the linked list to be sorted and headRef be the pointer to head. Merge sort on a linked list changes next links to sort the nodes (not the data at the nodes), so the head node may change if the data at the original head is not the smallest value in the list; this is why the head is passed by reference.
Note: Other sorting methods also give decent time complexity, but quicksort degrades to O(n^2) in the worst case, and heap sort is not suitable for linked lists because it relies on random access.
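The merge sort described above can be sketched for a singly linked list as follows (an illustrative sketch; note how the head is passed by reference, because the links rather than the data are rearranged):

```c
#include <stddef.h>

typedef struct Node {
    int data;
    struct Node *next;
} Node;

/* Merge two sorted lists; returns the merged head. */
Node *merge(Node *a, Node *b) {
    Node dummy = {0, NULL};
    Node *tail = &dummy;
    while (a != NULL && b != NULL) {
        if (a->data <= b->data) { tail->next = a; a = a->next; }
        else                    { tail->next = b; b = b->next; }
        tail = tail->next;
    }
    tail->next = (a != NULL) ? a : b;
    return dummy.next;
}

/* Split the list after its middle node using slow/fast pointers. */
void split(Node *head, Node **front, Node **back) {
    Node *slow = head, *fast = head->next;
    while (fast != NULL && fast->next != NULL) {
        slow = slow->next;
        fast = fast->next->next;
    }
    *front = head;
    *back = slow->next;
    slow->next = NULL;
}

/* Sorts by relinking next pointers, so the head node may change;
   hence the head is passed by reference. O(n log n) worst case. */
void mergeSort(Node **headRef) {
    Node *head = *headRef, *a, *b;
    if (head == NULL || head->next == NULL) return;
    split(head, &a, &b);
    mergeSort(&a);
    mergeSort(&b);
    *headRef = merge(a, b);
}
```

Calling mergeSort(&head) on 3 → 1 → 2 relinks the nodes into 1 → 2 → 3 and updates head to the node holding 1.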
Note: The question says the list "needs to be maintained in sorted order" but does not state clearly whether the order must hold after every insertion (Method-1) or only after all insertions (Method-2), so both interpretations are accepted.
 Question 2

Consider the C code fragment given below.

```c
typedef struct node {
    int data;
    struct node *next;
} node;

void join(node *m, node *n) {
    node *p = n;
    while (p->next != NULL) {
        p = p->next;
    }
    p->next = m;
}
```

Assuming that m and n point to valid NULL-terminated linked lists, invocation of join will

 A append list m to the end of list n for all inputs. B either cause a null pointer dereference or append list m to the end of list n. C cause a null pointer dereference for all inputs. D append list n to the end of list m for all inputs.
Question 2 Explanation:
As specified in the question, m and n point to valid lists, but no condition states whether the lists are empty (NULL) or not.
So we have to consider both cases.
⇾ 1. If the lists are not NULL, invocation of join appends list m to the end of list n.
m = 1, 2, 3
n = 4, 5, 6
After join, n becomes 4, 5, 6, 1, 2, 3, NULL.
⇾ 2. If list n is empty (i.e. n is NULL), then p->next dereferences a NULL pointer and the invocation crashes.
Hence, join either causes a NULL pointer dereference or appends list m to the end of list n.
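The crash case can be seen directly from the code: join walks list n first, so n must be non-empty (the helper make below is illustrative, not from the question):

```c
#include <stdlib.h>

typedef struct node {
    int data;
    struct node *next;
} node;

node *make(int data, node *next) {   /* illustrative list-building helper */
    node *p = malloc(sizeof(node));
    p->data = data;
    p->next = next;
    return p;
}

void join(node *m, node *n) {
    node *p = n;                 /* if n is NULL, the loop test below
                                    dereferences a NULL pointer */
    while (p->next != NULL)
        p = p->next;
    p->next = m;                 /* last node of n now points to head of m */
}
```

With m = 1→2→3 and n = 4→5→6, join(m, n) leaves n as 4→5→6→1→2→3; with n = NULL, the very first p->next read crashes. Note that m being NULL is harmless: the last node of n then simply points to NULL.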
 Question 3

N items are stored in a sorted doubly linked list. For a delete operation, a pointer is provided to the record to be deleted. For a decrease-key operation, a pointer is provided to the record on which the operation is to be performed.

An algorithm performs the following operations on the list in this order: Θ(N) delete, O(log N) insert, O(log N) find, and Θ(N) decrease-key. What is the time complexity of all these operations put together?

 A O(log2 N) B O(N) C O(N2) D Θ(N2 logN)
Question 3 Explanation:
→ Delete takes O(1) time (a pointer to the node is given, and a doubly linked list can be unlinked without searching for the predecessor)
→ Insert takes O(N) time (the sorted position must be found by a linear scan)
→ Find takes O(N) time (linear scan)
→ Decrease-key takes O(N) time (update the key, then reposition the node to keep the list sorted)
Now the number of times each operation is performed is given, so the total complexity is:
→ Delete = Θ(N) × O(1) = O(N)
→ Insert = O(log N) × O(N) = O(N log N)
→ Find = O(log N) × O(N) = O(N log N)
→ Decrease-key = Θ(N) × O(N) = O(N^2)
So, the overall time complexity is O(N^2).
 Question 4
The following C function takes a simply-linked list as input argument. It modifies the list by moving the last element to the front of the list and returns the modified list. Some part of the code is left blank.
```c
typedef struct node {
    int value;
    struct node *next;
} Node;

Node *move_to_front(Node *head)
{
    Node *p, *q;
    if ((head == NULL) || (head->next == NULL))
        return head;
    q = NULL; p = head;
    while (p->next != NULL)
    {
        q = p;
        p = p->next;
    }
    _______________________________
    return head;
}
```

 A q = NULL; p→next = head; head = p; B q→next = NULL; head = p; p→next = head; C head = p; p→next = q; q→next = NULL; D q→next = NULL; p→next = head; head = p;
Question 4 Explanation:
The C function takes a singly linked list as its input argument.
The function modifies the list by moving the last element to the front of the list.
Let the list be 1 → 2 → 3 → 4; the modified list must be 4 → 1 → 2 → 3.
Algorithm:
Traverse the list to the last node using two pointers: one (p) to hold the address of the last node and the other (q) to hold the address of the second-last node.
After the loop completes:
(i) Make the second-last node the new last node (q→next = NULL).
(ii) Set the next of the old last node to the head, and make the old last node the new head.

while (p→next != NULL)
{
    q = p;
    p = p→next;
}
― p travels to the end of the list while q trails one node behind it, so when the loop terminates p points to the last node and q to the second-last node.
Now q points to the second-last node and p to the last node (figure omitted).
According to the Algorithm new lines must be
q → next = NULL; p → next = head; head = p
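With the blank filled in by option D, the complete function reads (reassembled here for checking):

```c
#include <stddef.h>

typedef struct node {
    int value;
    struct node *next;
} Node;

Node *move_to_front(Node *head)
{
    Node *p, *q;
    if ((head == NULL) || (head->next == NULL))
        return head;
    q = NULL; p = head;
    while (p->next != NULL) {   /* p ends at the last node, q one behind */
        q = p;
        p = p->next;
    }
    q->next = NULL;             /* second-last node becomes the last node */
    p->next = head;             /* old last node now precedes the old head */
    head = p;                   /* old last node becomes the new head */
    return head;
}
```

On 1 → 2 → 3 → 4 this returns 4 → 1 → 2 → 3, as required.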
 Question 5
The following C function takes a singly-linked list of integers as a parameter and rearranges the elements of the list. The function is called with the list containing the integers 1, 2, 3, 4, 5, 6, 7 in the given order. What will be the contents of the list after the function completes execution?

```c
struct node {
    int value;
    struct node *next;
};

void rearrange(struct node *list)
{
    struct node *p, *q;
    int temp;
    if (!list || !list->next)
        return;
    p = list;
    q = list->next;
    while (q) {
        temp = p->value;
        p->value = q->value;
        q->value = temp;
        p = q->next;
        q = p ? p->next : 0;
    }
}
```
 A 1,2,3,4,5,6,7 B 2,1,4,3,6,5,7 C 1,3,2,5,4,7,6 D 2,3,4,5,6,7,1
Question 5 Explanation:
The given list is 1,2,3,4,5,6,7.
After 1st Iteration: 2,1,3,4,5,6,7
2nd Iteration: 2,1,4,3,5,6,7
3rd Iteration: 2,1,4,3,6,5,7
After each exchange, the pointers skip past the pair just swapped because of p = q→next, so the next iteration starts at the first untouched element.

Hence 2,1,4,3,6,5,7.
 Question 6

 A rear node B front node C not possible with a single pointer D node next to front
Question 6 Explanation:
If the single pointer points to the rear node of the circular linked list, both enqueue (insert after the rear) and dequeue (remove the node just after the rear, which is the front) can be performed in O(1) time. If the pointer points to the front node instead, reaching the rear for an enqueue requires traversing the whole list, so O(1) time is not possible.
 Question 7
Suppose each set is represented as a linked list with elements in arbitrary order. Which of the operations among union, intersection, membership, cardinality will be the slowest?
 A union only B intersection, membership C membership, cardinality D union, intersection
Question 7 Explanation:
Let the number of elements in list 1 be n1 and in list 2 be n2.
Union:
To form the union of two lists, each element of one list must be searched for in the other list to avoid duplicates. So, the time complexity is O(n1 × n2).
Intersection:
To form the intersection of two lists, each element of one list is searched for in the other list to see whether it is present. So, the time complexity is again O(n1 × n2).
Membership:
Here we check whether a particular element is present in a list, which is a single linear scan: at most O(n1 + n2).
Cardinality:
Here we find the size of a set, which requires traversing its list once: O(n1 + n2).
Hence, union and intersection are the slowest.
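The cost argument can be sketched in code (illustrative helpers, not from the question): membership is one scan, while intersection nests a membership test inside a scan, giving the O(n1 × n2) bound.

```c
#include <stdbool.h>
#include <stddef.h>

typedef struct Node {
    int data;
    struct Node *next;
} Node;

/* Membership: a single linear scan, O(n). */
bool member(Node *list, int x) {
    for (Node *p = list; p != NULL; p = p->next)
        if (p->data == x)
            return true;
    return false;
}

/* Intersection size: for each element of a, scan all of b — O(n1 * n2).
   Union faces the same nested scan, to avoid emitting duplicates. */
int intersectionCount(Node *a, Node *b) {
    int count = 0;
    for (Node *p = a; p != NULL; p = p->next)
        if (member(b, p->data))
            count++;
    return count;
}
```

Cardinality would be a single traversal like member, just counting nodes instead of comparing, hence also linear.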
 Question 8

Let P be a singly linked list. Let Q be the pointer to an intermediate node x in the list. What is the worst-case time complexity of the best known algorithm to delete the node x from the list?

 A O(n) B O(log2 n) C O(logn) D O(1)
Question 8 Explanation:
The worst-case complexity of deleting a node from a singly linked list, given a pointer to that node, is O(1): the data of the next node is copied into x and the next node is unlinked. Since x is an intermediate node, a successor is guaranteed to exist.
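The standard O(1) trick can be sketched as follows (assuming, as the question guarantees, that x is an intermediate node and therefore has a successor):

```c
#include <stdlib.h>

typedef struct Node {
    int data;
    struct Node *next;
} Node;

/* Delete the node x points to, without access to its predecessor.
   O(1) because x is an intermediate node, so x->next always exists. */
void deleteNode(Node *x) {
    Node *victim = x->next;
    x->data = victim->data;   /* overwrite x with its successor's data */
    x->next = victim->next;   /* unlink the successor */
    free(victim);
}
```

Note this trick cannot delete the last node, which is exactly why the question restricts x to an intermediate node.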
 Question 9
Consider the function f defined below.
```c
struct item {
    int data;
    struct item *next;
};

int f(struct item *p)
{
    return ((p == NULL) || (p->next == NULL) ||
            ((p->data <= p->next->data) &&
             f(p->next)));
}
```
For a given linked list p, the function f returns 1 if and only if
 A the list is empty or has exactly one element B the elements in the list are sorted in non-decreasing order of data value C the elements in the list are sorted in non-increasing order of data value D not all elements in the list have the same data value
Question 9 Explanation:
The function returns 1 if and only if the elements in the list are sorted in non-decreasing order of data value. The first two clauses make an empty or single-element list (which is trivially non-decreasing) return 1 as well, and the recursion checks each adjacent pair of nodes.
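The behavior can be checked directly with a few sample lists (the cons helper is illustrative, not from the question):

```c
#include <stdlib.h>

struct item {
    int data;
    struct item *next;
};

/* Returns 1 iff the list is empty, a singleton, or every adjacent pair
   satisfies data <= next->data, i.e. the list is non-decreasing. */
int f(struct item *p)
{
    return ((p == NULL) || (p->next == NULL) ||
            ((p->data <= p->next->data) &&
             f(p->next)));
}

struct item *cons(int data, struct item *next)   /* illustrative helper */
{
    struct item *p = malloc(sizeof(struct item));
    p->data = data;
    p->next = next;
    return p;
}
```

f on 1 → 2 → 2 returns 1 (non-decreasing, duplicates allowed), on 3 → 1 returns 0, and on an empty list returns 1, matching option B.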
 Question 10
In the worst case, the number of comparisons needed to search a singly linked list of length n for a given element is
 A log n B n/2 C (log2)n - 1 D n
Question 10 Explanation:
In the worst case the element being searched for is at the last node or not present at all, so every node must be examined. Hence n comparisons are needed to search a singly linked list of length n.
 Question 11
The concatenation of two lists is to be performed in O(1) time. Which of the following implementations of a list should be used?
 A Singly linked list B Doubly linked list C Circular doubly linked list D Array implementation of list
Question 11 Explanation:
In a circular doubly linked list, the concatenation of two lists can be performed in O(1) time, because the last node of each list is reachable from its head in O(1) time (via the head's prev pointer), so only a constant number of pointers need to change.
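The O(1) concatenation touches only four pointers, independent of list length (a minimal sketch, assuming both lists are non-empty):

```c
typedef struct DNode {
    int data;
    struct DNode *prev, *next;
} DNode;

/* Concatenate two non-empty circular doubly linked lists in O(1):
   a and b point to any node of each list, treated as the heads.
   Only four pointer assignments are needed, regardless of length. */
void concat(DNode *a, DNode *b) {
    DNode *aLast = a->prev;   /* last node of list a */
    DNode *bLast = b->prev;   /* last node of list b */
    aLast->next = b;
    b->prev = aLast;
    bLast->next = a;
    a->prev = bLast;
}
```

A singly linked list would need a full traversal to find the tail, and a plain doubly linked list needs an explicit tail pointer; the circular structure makes the tail reachable from the head in one step.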
 Question 12
Linked lists are not suitable data structures for which one of the following problems?
 A Insertion sort B Binary search C Radix sort D Polynomial manipulation