1
Algorithm Efficiency
There are often many approaches (algorithms) to solve a problem. How do we choose between them? At the heart of computer program design are two (sometimes conflicting) goals:
(1) To design an algorithm that is easy to understand, code, and debug.
(2) To design an algorithm that makes efficient use of the computer's resources.
2
Algorithm Efficiency (cont)
Goal (1) is the concern of Software Engineering. Goal (2) is the concern of data structures and algorithm analysis. When goal (2) is important, how do we measure an algorithm’s cost?
3
How to Measure Efficiency?
Empirical comparison (run programs) vs. asymptotic algorithm analysis. Empirical comparison is difficult to do "fairly" and is time consuming.
Critical resources: time, space (disk, RAM), programmer's effort, ease of use (user's effort).
Factors affecting running time: machine load, OS, compiler, problem size, specific input values for the given problem size.
For most algorithms, running time depends on the "size" of the input. Running time is expressed as T(n) for some function T on input size n.
4
Growth Rate
The growth rate for an algorithm is the rate at which the cost (running time) of the algorithm grows as the size of its input grows.
5
Big O Notation
Big O notation, also called the order of magnitude, describes the amount of work the computer performs, independent of the size of the program or its number of lines of code. It is a function, obtained by mathematical approximation, that relates the cost to the size of the problem, and it gives us a basis for comparing algorithms.

Algorithm 1:
  List[1] = 0;
  List[2] = 0;
  :
  List[1000] = 0;

Algorithm 2:
  for (i=1; i<=n; i=i+1)
    List[i] = 0;

Algorithm 1 and Algorithm 2 both have order of magnitude O(n).
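A minimal C sketch of the two approaches, to make the comparison concrete (the fixed array size and zero-based indexing are illustrative adaptations, not from the slide):

  #include <stdio.h>
  #define N 1000

  int main(void) {
      int list[N];

      /* Algorithm 1: one explicit assignment per element, N statements in the source. */
      list[0] = 0;
      list[1] = 0;
      /* ... one such line for every remaining element ... */
      list[N - 1] = 0;

      /* Algorithm 2: a loop that performs the same N assignments. */
      for (int i = 0; i < N; i++)
          list[i] = 0;

      /* Either way the machine executes about N assignments, so both are O(N). */
      printf("initialized %d elements\n", N);
      return 0;
  }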
6
Examples of Growth Rate (1)
for (i=1; i<=1000; i=i+1)   <<application code>>;    1000 times
for (i=1; i<=1000; i=i+2)                             500 times
for (i=1; i<1000; i=i*2)                              10 times
for (i=1000; i>=1; i=i/2)                             10 times
Asymptotic analysis is defined for equations. We need to convert programs to equations in order to analyze them.
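One way to check these counts is to instrument each loop with a counter; a minimal sketch (variable names are illustrative):

  #include <stdio.h>

  int main(void) {
      int count, i;

      count = 0;
      for (i = 1; i <= 1000; i = i + 1) count++;   /* 1000 iterations */
      printf("i=i+1: %d\n", count);

      count = 0;
      for (i = 1; i <= 1000; i = i + 2) count++;   /* 500 iterations */
      printf("i=i+2: %d\n", count);

      count = 0;
      for (i = 1; i < 1000; i = i * 2) count++;    /* 10 iterations: i = 1, 2, 4, ..., 512 */
      printf("i=i*2: %d\n", count);

      count = 0;
      for (i = 1000; i >= 1; i = i / 2) count++;   /* 10 iterations: i = 1000, 500, ..., 1 */
      printf("i=i/2: %d\n", count);
      return 0;
  }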
7
Examples (2)
for (j=1; j<=10; j=j+1)        10 times
  for (i=1; i<=10; i=i+1)      10 iterations
    <<application code>>;      100 iterations in total

for (j=1; j<=10; j=j+1)        10 iterations
  for (i=1; i<=10; i=i*2)      log₂10 iterations
                               10*log₂10 iterations in total

for (j=1; j<=10; j=j+1)        10 times
  for (i=1; i<=j; i=i+1)       (10+1)/2 times on average
                               55 iterations in total
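The 55 in the last example is the sum of the inner-loop counts: the pass with j = 1 does 1 iteration, j = 2 does 2, and so on, giving 1 + 2 + ... + 10 = 10(10+1)/2 = 55, an average of (10+1)/2 = 5.5 iterations per pass of the outer loop.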
8
Example: Input Size = n
for (i=1; i<=n; i=i+1)   <<application code>>;   T(n) = n
for (i=1; i<=n; i=i+2)                           T(n) = n/2
for (i=1; i<n; i=i*2)    <<application code>>;   T(n) ≈ log₂n
for (i=n; i>=1; i=i/2)                           T(n) ≈ log₂n
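The log₂n count for the doubling loop follows from the values i takes: 1, 2, 4, ..., 2^k, and the loop stops once 2^k reaches n, so it runs about log₂n times. For n = 1000 this gives the 10 iterations seen on the previous slide, since 2^10 = 1024.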
9
Examples (2)
for (j=1; j<=n; j=j+1)
  for (i=1; i<=n; i=i+1)
    <<application code>>;      n*n = n² iterations

for (j=1; j<=n; j=j+1)
  for (i=1; i<=n; i=i*2)       n*log₂n iterations

for (j=1; j<=n; j=j+1)
  for (i=1; i<=j; i=i+1)       n(n+1)/2 iterations
(These totals follow from the same counting as on the previous slides, with n in place of 10.)
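The last total is the triangular sum 1 + 2 + ... + n = n(n+1)/2 ≈ n²/2, so the three nested-loop patterns above are O(n²), O(n log n), and O(n²) respectively.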
10
Examples (3)
// Find the position of the largest value
int largest(int array[], int n) {
  int currlarge = 0;               // Position of largest value seen
  for (int i=1; i<n; i++)          // For each value
    if (array[currlarge] < array[i])
      currlarge = i;               // Remember position
  return currlarge;                // Return position of largest
}
As n grows, how does T(n) grow? Cost: T(n) = c₁n + c₂ steps.
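A minimal sketch of calling this function (the test data and main() are illustrative):

  #include <stdio.h>

  /* Same function as on the slide. */
  int largest(int array[], int n) {
      int currlarge = 0;                 /* Position of largest value seen */
      for (int i = 1; i < n; i++)
          if (array[currlarge] < array[i])
              currlarge = i;
      return currlarge;
  }

  int main(void) {
      int data[] = {7, 42, 3, 19, 25};
      int pos = largest(data, 5);        /* The loop body runs n-1 times: T(n) = c1*n + c2 */
      printf("largest value %d at position %d\n", data[pos], pos);
      return 0;
  }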
11
Growth Rate Graph
12
Best, Worst, Average Cases
Sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
Best case: K is found at the first position. Cost is 1 compare.
Worst case: K is found at the last position. Cost is n compares.
Average case: (n+1)/2 compares, IF we assume the element with value K is equally likely to be in any position in the array.
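A minimal C sketch of such a sequential search (the function name and the -1 not-found convention are assumptions, not from the slide):

  /* Return the index of the first element equal to K, or -1 if K is absent. */
  int seq_search(int array[], int n, int K) {
      for (int i = 0; i < n; i++)      /* 1 compare in the best case, n in the worst */
          if (array[i] == K)
              return i;
      return -1;
  }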
13
Which Analysis to Use? While average time appears to be the fairest measure, it may be difficult to determine: average-time analysis requires knowledge of the input distribution, for example the equal-likelihood assumption used for the average case in the last example. When is worst-case time important? Worst-case time is important for real-time algorithms.
14
Big-O Notation
Big-oh notation represents an upper bound on the growth rate. It is useful for estimating the CPU or memory resources an algorithm requires.
Example: If T(n) = 3n², then T(n) is in O(n²).
We want the tightest upper bound: while T(n) = 3n² is also in O(n³), we prefer O(n²), because saying O(n²) gives more information than saying O(n³).
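Formally (this definition is standard, though not stated on the slide): T(n) is in O(f(n)) if there exist positive constants c and n₀ such that T(n) ≤ c·f(n) for all n > n₀. For T(n) = 3n², taking c = 3 and n₀ = 1 gives 3n² ≤ 3·n² for all n > 1, so T(n) is in O(n²).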
15
Big-Oh Examples
Example 1: Finding value X in an array (average case): T(n) is in O(n).
Example 2: T(n) = c₁n² + c₂n: T(n) is in O(n²).
Example 3: T(n) = c, where c is a constant: we say this is in O(1).
Here we are doing average-case analysis; the c's are constants whose actual values are irrelevant.
16
Running Time Examples (1)
Example 1: O(1)
  a = b;
This assignment takes constant time, so it is O(1). The traditional notation is O(1), not O(c).

Example 2: O(n)
  sum = 0;
  for (i=1; i<=n; i++)
    sum += n;
This loop is O(n), even though the final value of sum is n².
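A minimal sketch that makes the distinction concrete by counting statements executed separately from the value computed (names are illustrative):

  #include <stdio.h>

  int main(void) {
      int n = 100;
      long sum = 0, steps = 0;

      for (int i = 1; i <= n; i++) {
          sum += n;       /* the value of sum grows toward n*n */
          steps++;        /* but the loop body runs only n times */
      }
      printf("steps = %ld (O(n)), sum = %ld (= n*n)\n", steps, sum);
      return 0;
  }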
17
Running Time Examples (2)
Example 3: O(n²)
  sum = 0;                     O(1)
  for (j=1; j<=n; j++)         O(n²)
    for (i=1; i<=j; i++)
      sum++;
  for (k=0; k<n; k++)          O(n)
    A[k] = k;
The first statement is O(1). The double for loop performs 1 + 2 + ... + n = n(n+1)/2 increments, which is O(n²). The final for loop is O(n). Result: O(n²).
18
Running Time Examples (3)
Example 4: O(n²)
  sum1 = 0;                    O(1)
  for (i=1; i<=n; i++)         O(n²)
    for (j=1; j<=n; j++)
      sum1++;

  sum2 = 0;                    O(1)
  for (i=1; i<=n; i++)         O(n²)
    for (j=1; j<=i; j++)
      sum2++;
In the first double loop, sum1 becomes n². In the second, sum2 becomes n(n+1)/2. Both are O(n²).
19
Running Time Examples (4)
Example 5: O(n log n)
  sum1 = 0;
  for (k=1; k<=n; k*=2)        O(log n)
    for (j=1; j<=n; j++)       O(n)
      sum1++;

  sum2 = 0;
  for (k=1; k<=n; k*=2)        O(n)
    for (j=1; j<=k; j++)
      sum2++;
The first pair of loops does n work for each of the log n values of k, or O(n log n). The second does 2^k work for k = 0 to log n, a geometric series that sums to about 2n, or O(n).
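The O(n) claim for the second pair of loops comes from the geometric series (taking n as a power of 2 for simplicity): 1 + 2 + 4 + ... + n = 2n − 1, which is O(n). The first pair contributes n work for each of the log₂n + 1 values of k, i.e. n(log₂n + 1), which is O(n log n).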
20
Algorithm Efficiency
Constant            O(1)        Accessing the i-th element of an array
Logarithmic         O(log n)    Binary search
Linear              O(n)        Sequential search
Linear logarithmic  O(n log n)  Merge sort
Quadratic           O(n²)       Simple (elementary) sorting
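For example, a minimal binary-search sketch on a sorted array (names and conventions are illustrative); each step halves the remaining range, which is where the O(log n) bound comes from:

  /* Binary search in a sorted array: return the index of K, or -1 if K is absent. */
  int binary_search(int array[], int n, int K) {
      int low = 0, high = n - 1;
      while (low <= high) {                    /* at most about log2(n) iterations */
          int mid = low + (high - low) / 2;
          if (array[mid] == K)
              return mid;
          else if (array[mid] < K)
              low = mid + 1;                   /* discard the lower half */
          else
              high = mid - 1;                  /* discard the upper half */
      }
      return -1;
  }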
21
Algorithm Efficiency
Assumes an instruction speed of one microsecond and 10 instructions in the loop.

Efficiency          Big-O       Iterations   Est. Time
Logarithmic         O(log n)    14           microseconds
Linear              O(n)        10,000       0.1 seconds
Linear logarithmic  O(n log n)  140,000      2 seconds
Quadratic           O(n²)       10,000²      15-20 min.
Polynomial          O(n^k)      10,000^k     hours
Exponential         O(c^n)      2^10,000     intractable
Factorial           O(n!)       10,000!      intractable
22
Faster Computer or Algorithm?
Old machine can run 10,000 basic operations per hour. New machine can run 100,000 basic operations per hour. What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change              n'/n
10n        1,000  10,000  n' = 10n            10
20n        500    5,000   n' = 10n            10
5n log n   250    1,842   √10·n < n' < 10n    7.37
2n²        70     223     n' = √10·n          3.16
2^n        13     16      n' = n + 3          -----

How much speedup? 10 times. More important: how much increase in problem size for the same time expended? That depends on the growth rate. n: size of input that can be processed in one hour (10,000 steps). n': size of input that can be processed in one hour on the new machine (100,000 steps). Note: for 2^n, if n = 1000, then n' would be 1003.
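As a worked check of the 2n² row: on the old machine 2n² ≤ 10,000 gives n ≈ 70 (2·70² = 9,800); on the new machine 2n² ≤ 100,000 gives n' ≈ 223 (2·223² ≈ 99,458); so n'/n ≈ 223/70 ≈ 3.16 = √10. A machine 10 times faster lets a quadratic algorithm handle only about √10 ≈ 3.16 times as much input in the same time.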