Algorithms Placement Preparation Stuff


Hi friends, these are some FAQs on algorithms, collected from various years' papers and companies' interviews. Hope this helps you in your preparations.


1. What is an 'algorithm'?
An algorithm consists of a finite set of steps, each of which may require one or more operations. These operations should be definite and effective. An algorithm should produce one or more outputs and may have zero or more inputs.
This consists of five distinct areas:
1. to devise algorithms
2. to validate the algorithms
3. to express the algorithms
4. to analyse the algorithms
5. to test the programs for the algorithms
2. What is a 'computational procedure'?
An algorithm that does not terminate is called a 'computational procedure'. An example of such a computational procedure is an operating system.
3. Define - 'recursive algorithm'.
An algorithm is said to be recursive if the same algorithm is invoked in the body of the algorithm. Recursive algorithms can be divided into direct recursive and indirect recursive algorithms.
Direct recursive:
An algorithm that calls itself.
Indirect Recursive:
An algorithm 'A' is said to be indirectly recursive if it calls another algorithm which in turn calls algorithm 'A'.
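Both kinds can be sketched in Python; the functions below (factorial, is_even, is_odd) are illustrative examples, not from the original text:

```python
# Direct recursion: factorial invokes itself in its own body.
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)

# Indirect recursion: is_even calls is_odd, which in turn calls is_even.
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)

print(factorial(5))   # 120
print(is_even(10))    # True
```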
4. How can you classify performance analysis?
'Performance analysis' can be classified as:
i. a priori analysis
ii. a posteriori analysis
A Priori Analysis:
Bounds on the algorithm's computing time are obtained by formulating a function of the input size, before the algorithm is run.
A Posteriori Analysis:
The actual space and time consumed are recorded while the algorithm is executing.
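A posteriori analysis can be as simple as timing a run; a minimal sketch (the `linear_sum` function is a made-up example):

```python
import time

def linear_sum(nums):
    """O(n) loop whose running time we want to measure."""
    total = 0
    for x in nums:
        total += x
    return total

data = list(range(1_000_000))
start = time.perf_counter()          # record wall-clock time around the call
result = linear_sum(data)
elapsed = time.perf_counter() - start
print(f"sum = {result}, took {elapsed:.4f} s")
```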
5. Define - 'Big-O'.
For a function f(n),
f(n) = O(g(n))
iff there exist positive constants c and n0 such that:
f(n) <= c*g(n)
for all n >= n0.
Big-O gives an upper bound on the growth of f(n), and is commonly used to express worst-case time complexity.
For example:
3n+2 = O(n) because
3n+2 <= 4n for all n >= 2.
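You can sanity-check the constants c = 4, n0 = 2 numerically (for a finite range, since the inequality itself holds for all n):

```python
# Verify 3n + 2 <= 4n for n = 2 .. 1000, i.e. f(n) <= c*g(n) with c=4, n0=2.
assert all(3 * n + 2 <= 4 * n for n in range(2, 1001))
print("inequality holds on the tested range")
```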
6. Give various computing times and their meaning.
Few of the important computing times are:
Computing Time : Meaning
O(1) : constant computing time
O(n) : linear computing time
O(n^2) : quadratic computing time
O(n^3) : cubic computing time
O(2^n) : exponential computing time
7. Give the most important 'basic designs' of algorithms.
There are five important basic designs for algorithms. They are:
i. Divide and conquer,
ii. The greedy method,
iii. Dynamic programming,
iv. Back-tracking,
v. Branch and bound.
8. How do 'divide and conquer' algorithms work?
For a function computing on n inputs, the divide-and-conquer strategy suggests splitting the inputs into k distinct subsets, 1 < k <= n, yielding k sub-problems. These sub-problems must be solved, and then a method must be found to combine the sub-solutions into a solution for the whole.
An example of this approach is the 'binary search' algorithm. The time complexity of binary search is O(log n).
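A minimal binary search in Python; each step discards half the remaining range, which is where the O(log n) bound comes from:

```python
def binary_search(a, key):
    """Return the index of key in the sorted list a, or -1 if absent."""
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2       # split the range at its midpoint
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            low = mid + 1             # key can only be in the upper half
        else:
            high = mid - 1            # key can only be in the lower half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 12))  # 3
```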
9. What is Greedy Method?
The greedy method suggests devising an algorithm that works in stages, considering one input at a time. At each stage, a decision is made regarding whether a particular input contributes to an optimal solution. An example of a problem solved with the greedy method is the 'knapsack problem'.
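For the fractional knapsack, the greedy choice is to take items in decreasing profit/weight ratio; a sketch (the instance used is a textbook example, not from the original text):

```python
def fractional_knapsack(items, capacity):
    """items: list of (profit, weight) pairs; fractions of items allowed.
    Greedy stage: consider inputs in decreasing profit/weight ratio."""
    items = sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for profit, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)          # take all of it, or what fits
        total += profit * (take / weight)
        capacity -= take
    return total

# Profits (60, 100, 120), weights (10, 20, 30), capacity 50.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```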
10. What is Dynamic Programming?
Dynamic Programming is an algorithm design method that can be used when the solution to a problem can be viewed as the result of a sequence of decisions. An example of an algorithm using dynamic programming is the 'multistage graphs' problem.
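A simpler illustration of the same idea (not the multistage-graph algorithm itself) is computing Fibonacci numbers bottom-up, where each value is decided from previously computed sub-results:

```python
def fib(n):
    """Bottom-up dynamic programming: build each answer from the
    solutions of smaller sub-problems already computed."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr   # reuse the two stored sub-solutions
    return curr

print(fib(10))  # 55
```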
11. What are the time complexities for the following algorithms?
Binary Search : O(log n)
Finding the maximum and minimum of a set of n numbers : 3n/2 - 2 comparisons
Merge Sort : O(n log n)
Insertion Sort : O(n^2)
Quick Sort : O(n log n) on average, O(n^2) in the worst case
Selection Sort : O(n^2)
12. What is the difference between Merge Sort and Quick sort?
Both merge sort and quick sort have the same average time complexity, i.e. O(n log n). In merge sort the array a[1:n] is divided at its midpoint into sub-arrays which are independently sorted and later merged. Whereas, in quick sort, the division into two sub-arrays is made so that the sorted sub-arrays do not need to be merged later.
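The split-then-merge structure of merge sort can be sketched as:

```python
def merge_sort(a):
    """Divide at the midpoint, sort each half, then merge the halves."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # each half sorted independently
    right = merge_sort(a[mid:])
    merged = []                       # the merge step quick sort avoids
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```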
13. Is there any optimal solution for matrix multiplication?
Yes. The divide-and-conquer method suggests using Strassen's matrix multiplication. With this method, the time complexity is O(n^2.81) rather than O(n^3).
14. Define 'minimum-cost spanning tree'.
Let G=(V,E) be an undirected connected graph. A sub-graph t=(V,E') of G is a spanning tree of G if and only if t is a tree. A minimum-cost spanning tree is a spanning tree whose total edge weight is minimum.
To find a minimum-cost spanning tree we have the following methods:
Prim's Algorithm : O(n^2)
Kruskal's Algorithm : O(e log e)
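A compact sketch of Kruskal's algorithm using a union-find structure (the triangle-graph instance at the bottom is a made-up example):

```python
def kruskal(n, edges):
    """n vertices labelled 0..n-1; edges: list of (weight, u, v) tuples.
    Returns the total weight of a minimum-cost spanning tree."""
    parent = list(range(n))

    def find(x):                       # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):      # consider cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                   # keep an edge only if it joins two trees
            parent[ru] = rv
            total += w
    return total

# Triangle with edge weights 1, 2, 3: the MST uses the edges of weight 1 and 2.
print(kruskal(3, [(1, 0, 1), (2, 1, 2), (3, 0, 2)]))  # 3
```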
15. Define ‘articulation points‘.
A Vertex V in a connected graph G is an articulation point if and only if the deletion of vertex V together will all edges incident for disconnects the graph into two or more non-empty Components.
16. Define 'biconnected graph'.
A graph G is biconnected if and only if it contains no articulation points.
17. What are 'explicit' and 'implicit' constraints?
'Explicit constraints' are rules that restrict each xi to take on values only from a given set. 'Implicit constraints' are rules that determine which of the tuples in the solution space satisfy the criterion function.
18. Give 'Cook's theorem'.
Satisfiability is in P iff P = NP,
where P is the set of all decision problems solvable by deterministic algorithms in polynomial time, and NP is the set of all decision problems solvable by non-deterministic algorithms in polynomial time.

