


Question

Please give your reasons, thank you.

(a) The load factor, lambda, of a hash table doubles when we replace linear probing by quadratic probing.

(b) Quicksort is a fast sorting algorithm for most problems, but its running time is O(n^2) in the worst case.

(c) When describing data structures and algorithms in computer science, a collection of trees is called a grove.

(d) If vertex u comes before vertex v in a topological sort of the vertices of a directed graph G, then there cannot be a directed path from v to u in G.

(e) If every edge in graph G has a different weight, then we know that the shortest path from vertex u to vertex v is unique.

(f) If all edge weights in graph G are the same (and equal to 1, say), then G might have more than one minimum spanning tree.

(g) You can convert a breadth-first search (BFS) function into a depth-first search (DFS) function by replacing its FIFO queue of pending vertices with a LIFO stack.

(h) If all edge weights are equal, then Dijkstra's shortest path algorithm turns into depth-first search (DFS).

(i) In a directed graph, the sum of the indegrees over all vertices must equal the sum of the outdegrees over all vertices.

(j) Path compression is a technique for increasing the maximum flow in a graph by deleting the smallest edge.

Explanation / Answer

The load factor, lambda, is the ratio n/m, where n is the number of entries stored and m is the size of the bucket array. In linear probing, when a collision occurs the table is searched sequentially for an empty slot: a fixed step size (the same for all keys, usually 1) is repeatedly added to the initial hash value, modulo the table size, until a free slot is found or the entire table has been traversed. Quadratic probing instead adds successive values of a quadratic polynomial (1, 4, 9, ...) to the original hash value, so that keys colliding at the same slot skip over primary clusters in the table.

The statement in (a) is therefore false. The load factor depends only on the number of entries and the size of the bucket array, so it does not change when linear probing is replaced by quadratic probing; only the probe sequence used to resolve collisions changes, not lambda.
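A minimal Python sketch of the two probe sequences may make this concrete (the function names `linear_probe` and `quadratic_probe`, the table size 11, and the example hash value 3 are illustrative choices, not from the question):

```python
def linear_probe(h, i, m):
    # i-th probe for a key with initial hash h: fixed step of 1, modulo m
    return (h + i) % m

def quadratic_probe(h, i, m):
    # i-th probe: add successive values of a quadratic polynomial (here i*i)
    return (h + i * i) % m

m = 11  # table size (a prime, as is typical for quadratic probing)
h = 3   # example initial hash value

print([linear_probe(h, i, m) for i in range(5)])     # [3, 4, 5, 6, 7]
print([quadratic_probe(h, i, m) for i in range(5)])  # [3, 4, 7, 1, 8]

# The load factor is the same regardless of which probe function is used:
n = 7        # number of entries stored
lam = n / m  # lambda = n/m, independent of the collision-resolution scheme
```

Both functions visit different slot sequences after a collision, but neither changes how many entries the table holds, which is all that lambda measures.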

If you would like the remaining parts answered, please post them as separate questions.

Thank you.