Sorting Problem


Preface

Yesterday I learned Lesson One of Introduction to Algorithms on open.163.com. Thanks to NetEase and MIT.


Insertion Sort

We start with insertion sort, which is an efficient algorithm for sorting a small number of elements.

Pseudocode:

INSERTION-SORT(A)

for j <- 2 to length[A]
    do key <- A[j]
       ▷ Insert A[j] into the sorted sequence A[1..j-1]
       i <- j-1
       while i > 0 and A[i] > key
           do A[i+1] <- A[i]
              i <- i-1
       A[i+1] <- key

Loop invariants and the correctness of insertion sort

We state these properties of A[1..j-1] (the subarray A[1..j-1] consists of the elements originally in positions 1 through j-1, but now in sorted order) formally as a loop invariant.
We use loop invariants to help us understand why an algorithm is correct. We must show three things about a loop invariant:
+ Initialization: It is true prior to the first iteration of the loop.
+ Maintenance: If it is true before an iteration of the loop, it remains true before the next iteration.
+ Termination: When the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct.

——INTRODUCTION TO ALGORITHMS (Second Edition)
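
To make the invariant concrete, here is a minimal Python sketch of my own (not from the book or the lecture) that asserts the sortedness part of the invariant at the top of every iteration; it uses 0-based indexing, so a[0..j-1] plays the role of A[1..j-1]:

def insertion_sort_with_invariant(a):
    for j in range(1, len(a)):
        # Loop invariant (sortedness part only): a[0..j-1] is in sorted order.
        assert all(a[t] <= a[t + 1] for t in range(j - 1)), "invariant violated"
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:   # shift larger elements one slot right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                 # insert key into the sorted prefix
    return a

print(insertion_sort_with_invariant([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]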



$T(n) = \sum_{j=2}^{n} \Theta(j)$


Maybe you do not know what Θ (theta) means. There is a BIG IDEA of algorithms: to be able to take what is apparently a really messy, complicated situation and reduce it to being able to do some mathematics. That idea is called asymptotic analysis: ignore machine-dependent constants and, instead of the actual running time, look at the growth of the running time.

Asymptotic notation
Theta notation Θ (used here in a weak, informal sense)

Description: from a formula, drop the low-order terms and ignore the leading constant.
ex: $3n^3 + 90n^2 - 5n + 6046 \rightarrow \Theta(n^3)$
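
As a quick numerical sanity check (my own illustration, not from the lecture), the ratio of the full formula to its leading term n^3 approaches the leading constant 3 as n grows, which is exactly why the low-order terms and the constant can be dropped:

def f(n):
    return 3*n**3 + 90*n**2 - 5*n + 6046   # the full formula from the example above

for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**3)                  # ratio approaches the leading constant 3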


Now, we can talk about this formula.

$T(n) = \sum_{j=2}^{n} \Theta(j)$

Starting from i <- j-1, the inner while loop walks i down toward 1, doing just a constant amount of work for each value of i (reading A[i] and shifting it one position is a constant amount of work).
So the cost of the j-th iteration of the outer loop is $\Theta(j) = c \cdot j$, where c is a constant and j bounds the number of times the inner loop body runs.
ex:
if i = 4 when the while loop starts,

 while i > 0 and A[i] > key
     do A[i+1] <- A[i]
        i <- i-1

then do A[i+1] <- A[i] is executed four times, and each execution costs a constant amount of work.

(four times) × (a constant) — in general, the j-th outer iteration costs $\Theta(j) = c \cdot j$


$T(n) = \sum_{j=2}^{n} \Theta(j) = \sum_{j=2}^{n} c \cdot j$


$T(n) = \sum_{j=2}^{n} \Theta(j) = \Theta(n^2)$

Actually, the sum $\sum_{j=2}^{n} j$ is an arithmetic series (the worked sum appears after the quotation below).

In mathematics, an arithmetic progression (AP) or arithmetic sequence is a sequence of numbers such that the difference between the consecutive terms is constant. For instance, the sequence 5, 7, 9, 11, 13, 15 … is an arithmetic progression with common difference of 2.

—— Wikipedia
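
Using the standard arithmetic-series formula, spelled out here for completeness:

$$T(n) = \sum_{j=2}^{n} \Theta(j) = \Theta\Big(\sum_{j=2}^{n} j\Big) = \Theta\Big(\frac{n(n+1)}{2} - 1\Big) = \Theta(n^2)$$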

[Diagram: running time vs n for a $\Theta(n^2)$ algorithm and a $\Theta(n^3)$ algorithm; the two curves cross at $n = n_0$]

We can get some information from this diagram (sorry, it is ugly, but bear with it).
When $n = n_0$, the $\Theta(n^3)$ and $\Theta(n^2)$ algorithms use the same amount of time. But when $n > n_0$, it is obvious that the $\Theta(n^2)$ one is faster.
This is the reason why asymptotic analysis is a BIG IDEA of algorithms:

There is a BIG IDEA of algorithms: to be able to take what is apparently a really messy, complicated situation and reduce it to being able to do some mathematics.
It is to ignore machine-dependent constants and, instead of the actual running time, look at the growth of the running time.

Asymptotic analysis will be used throughout this blog.

—————————

Insertion Sort (Python)

def insertion_sort(a):
    for i in range(1, len(a)):          # a[0..i-1] is already sorted
        key = a[i]                      # save the current value
        if key < a[i-1]:                # key is out of place, insert it
            for j in range(i):          # find the right index to insert
                if key < a[j]:
                    break
            for k in range(j, i)[::-1]: # shift a[j..i-1] one slot to the right
                                        # range(j, i)[::-1] walks i-1, i-2, ..., j
                a[k+1] = a[k]
            a[j] = key                  # put key into its sorted position
    return a

A = [0, 2, 8, 2, 6, 4, 3, 9, 1, 3, 5, 6, 2, 3, 4, 6, 8, 2, 3, 1]  # demo list
print(insertion_sort(A))
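
As a quick check of the implementation above (my own addition), the result can be compared with Python's built-in sorted():

B = [0, 2, 8, 2, 6, 4, 3, 9, 1, 3, 5, 6, 2, 3, 4, 6, 8, 2, 3, 1]
assert insertion_sort(list(B)) == sorted(B)   # same result as the built-in sort
print("insertion_sort agrees with sorted()")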

Merge Sort

1. If n = 1, done.                               — Θ(1)
2. Recursively sort A[1..n/2] and A[n/2+1..n].   — 2T(n/2)
3. Merge the 2 sorted arrays.                    — Θ(n)

Θ(1) — a constant amount of stuff

——————————

20  12
13  11
 7   9
 2   1

Compare the smallest remaining element of each sorted array and choose the smaller one each time:
1,
1, 2,
1, 2, 7, ... -> new array

——————————

Every step here is some fixed number of operations, independent of the size of the arrays.

From the above computation, the merge uses time that can be expressed as Θ(n). (The time to actually go through this and merge the two arrays is order n; we call this linear time.)
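
Here is a minimal Python sketch of this two-finger merge, together with the recursion wrapped around it (my own illustration; the full C program appears later in this post):

def merge(left, right):
    # left and right are already sorted; repeatedly take the smaller head.
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])    # at most one of these two still has elements
    out.extend(right[j:])
    return out

def merge_sort(a):
    if len(a) <= 1:                        # 1. if n = 1, done
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]),      # 2. recursively sort both halves
                 merge_sort(a[mid:]))      # 3. merge the 2 sorted arrays

print(merge([2, 7, 13, 20], [1, 9, 11, 12]))  # [1, 2, 7, 9, 11, 12, 13, 20]
print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]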

Recurrence:

$$T(n) = \begin{cases} \Theta(1) & \text{if } n = 1 \quad \text{(usually omitted)} \\ 2T(n/2) + \Theta(n) & \text{if } n > 1 \end{cases}$$

Then, we use the recursion-tree technique to solve it.

$T(n) = 2T(n/2) + cn$

Here the Θ(n) term is written as $cn$, where c is a constant.


h (height) = the number of times n is halved until it reaches 1, which is lg n; the tree has n leaves.
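
Summing the recursion tree level by level (a standard argument, spelled out here for completeness): every level of internal nodes costs cn, there are lg n such levels, and the n leaves contribute Θ(n) in total, so

$$T(n) = \underbrace{cn + cn + \dots + cn}_{\lg n \text{ levels}} + \Theta(n) = cn\lg n + \Theta(n)$$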

—————————

$T(n) = cn\lg n + \Theta(n) = \Theta(n\lg n)$

$\Theta(n\lg n)$ is asymptotically faster than $\Theta(n^2)$. We can find some $n = n_0$ beyond which merge sort wins (just like the diagram above which compares $\Theta(n^2)$ to $\Theta(n^3)$).
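
For intuition only (the constant factors below are made up for illustration, not measured): even if merge sort carries a much larger constant, $c_2 \, n\lg n$ eventually beats $c_1 \, n^2$:

import math

c1, c2 = 1, 64                               # hypothetical constant factors
for n in (2**k for k in range(1, 13)):
    t_insertion = c1 * n * n                 # model of Theta(n^2)
    t_merge = c2 * n * math.log2(n)          # model of Theta(n lg n)
    print(n, "insertion wins" if t_insertion < t_merge else "merge wins")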

Merge Sort (C)

#include <stdio.h>

#define LEN 8
int a[LEN] = { 5, 2, 4, 7, 1, 3, 2, 6 };

void merge(int start, int mid, int end)
{
    int n1 = mid - start + 1;
    int n2 = end - mid;
    int left[n1], right[n2];
    int i, j, k;

    for (i = 0; i < n1; i++) /* left holds a[start..mid] */
        left[i] = a[start+i];
    for (j = 0; j < n2; j++) /* right holds a[mid+1..end] */
        right[j] = a[mid+1+j];

    i = j = 0;
    k = start;
    while (i < n1 && j < n2)
        if (left[i] < right[j])
            a[k++] = left[i++];
        else
            a[k++] = right[j++];

    while (i < n1) /* left[] is not exhausted */
        a[k++] = left[i++];
    while (j < n2) /* right[] is not exhausted */
        a[k++] = right[j++];
}

void sort(int start, int end)
{
    int mid;
    if (start < end) {
        mid = (start + end) / 2;
        printf("sort (%d-%d, %d-%d) %d %d %d %d %d %d %d %d\n", 
               start, mid, mid+1, end, 
               a[0], a[1], a[2], a[3], a[4], a[5], a[6], a[7]);
        sort(start, mid);
        sort(mid+1, end);
        merge(start, mid, end);
        printf("merge (%d-%d, %d-%d) to %d %d %d %d %d %d %d %d\n", 
               start, mid, mid+1, end, 
               a[0], a[1], a[2], a[3], a[4], a[5], a[6], a[7]);
    }
}

int main(void)
{
    sort(0, LEN-1);
    return 0;
}

via tsinghua.mergesort

UpdateLog



2015/12/21

—————————

Update : Insertion Sort & Merge Sort
