Calculate execution time of a sort algorithm in C

I have a question about how to measure the run time of just the sorting step of the algorithm. I searched a lot on the Internet and only found superficial material, nothing that helps with what I need.

I have an assignment where I have to analyze the time the algorithm takes just to sort data read from an external file (in this case I'm using random, non-repeating numbers from 1 to 1000 in a test .txt file). The code runs, but whenever I try it with a teste.txt file containing fewer than 1000 shuffled values, the reported run time comes out as zero. When a colleague told the teacher he was testing with 10,000 values, the teacher smiled and said to test with 10, 20, 50, 100, 150, 200... and so on. Is there any way to measure how long the algorithm takes to run that works both for small inputs and for larger ones?

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <sys/time.h>

#define TAM 1000

void SelectionSort_1(int vetor[]);


int main (){        

    int iCont, vetor[TAM];

    FILE *ent;

    ent = fopen("teste.txt", "r");

    if(ent == NULL){
        printf("Erro! Nao consegui abrir o arquivo...\n");
        exit(1);
    }

    for(iCont = 0; iCont < TAM; iCont++){
        //printf("Lendo posicao %d\n", iCont);
        if(fscanf(ent, "%d", &vetor[iCont]) == EOF){
            printf("Opa, tentei ler alem do final do arquivo...\n");
            break;
        }
    }

    SelectionSort_1(vetor);


    fclose( ent );

    printf("\n\nOrdenado: \n\n");
    for(iCont = 0; iCont < TAM; iCont++){
        printf("%d ", vetor[iCont]);
    }

    printf("\n\n\n\n");

    return 0;
}

void SelectionSort_1(int vetor[]){ 
    int iCont, jCont, min, aux = 0;
    struct timeval tv1, tv2;

    gettimeofday(&tv1, NULL);   /* timestamp before sorting */

    for(iCont = 0; iCont < TAM - 1; iCont++){
        min = iCont;
        for(jCont = iCont + 1; jCont < TAM; jCont++){
            if(vetor[jCont] < vetor[min]) 
                min = jCont;
        }
        if(vetor[iCont] != vetor[min]){
            aux = vetor[iCont];
            vetor[iCont] = vetor[min];
            vetor[min] = aux;
        }
    }

    gettimeofday(&tv2, NULL);   /* timestamp after sorting */
    printf("Total time = %.8f seconds\n",
           (double) (tv2.tv_usec - tv1.tv_usec) / 1000000 +
           (double) (tv2.tv_sec - tv1.tv_sec));
}
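
For reference, on POSIX systems clock_gettime(CLOCK_MONOTONIC) reports nanoseconds rather than the microseconds of gettimeofday(), so even a very fast sort shows a non-zero time. A minimal sketch of that approach, with a placeholder sort standing in for the one actually under test (the descending fill is only demo data, not the file from the question):

#include <stdio.h>
#include <time.h>

#define TAM 1000

/* Placeholder sort: any of the sorts being measured would go here. */
void sort_placeholder(int v[], int n){
    int i, j, aux;
    for(i = 0; i < n - 1; i++)
        for(j = i + 1; j < n; j++)
            if(v[j] < v[i]){ aux = v[i]; v[i] = v[j]; v[j] = aux; }
}

int main(){
    int vetor[TAM], i;
    for(i = 0; i < TAM; i++)
        vetor[i] = TAM - i;                  /* descending data, just for the demo */

    struct timespec t1, t2;
    clock_gettime(CLOCK_MONOTONIC, &t1);     /* timestamp before sorting */
    sort_placeholder(vetor, TAM);
    clock_gettime(CLOCK_MONOTONIC, &t2);     /* timestamp after sorting  */

    printf("Total time = %.9f seconds\n",
           (double)(t2.tv_sec - t1.tv_sec) +
           (double)(t2.tv_nsec - t1.tv_nsec) / 1000000000.0);
    return 0;
}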
  • Are you talking about asymptotic time? The kind of analysis that tells you the running time of selection sort grows with the square of the input? That insertion sort is quadratic in the average case but linear in the best case? That merge sort always executes on the order of n log n steps?

  • @Jeffersonquesado I didn't understand 100% of what you said, because I'm only now learning about asymptotic analysis in another course; what I need to know is how long Bubble/Selection/Merge... sort took to sort the shuffled file. Thank you.

  • Then just call the function that returns (in millis) the system time, or how long the program has been running. Call it once before sorting and once after, and take the difference. I don't remember the name of the function, but I think it's in time.h

  • I'll do a good search on the time library, then. The bad part is that in Portuguese it's a bit hard to find material, but since there's no way around it and I need to work on my English anyway, off to the English-language sources.. kkk Thanks for the tip; if it works out I'll post the result here!!

  • Just don't forget that it goes in an answer :-) We don't answer our own questions inside the question itself. Some people forget that, so I decided to leave this reminder

  • Examples of self-answered questions: https://answall.com/q/311780/64969 ; https://answall.com/q/314163/64969

  • Thank you!! kkk I'll use the answer button just below. : D

1 answer

There's the clock() function from the time.h library; with it you get the number of processor clock ticks :)

Using it is easy: first you capture the clock right before starting the sort and then again right when the sort finishes; to get the time between them you subtract the two clock values and divide by the CLOCKS_PER_SEC constant, much like this code does.

/* clock example: frequency of primes */
#include <stdio.h>      /* printf */
#include <time.h>       /* clock_t, clock, CLOCKS_PER_SEC */
#include <math.h>       /* sqrt */

int frequency_of_primes (int n) {
  int i,j;
  int freq=n-1;
  for (i=2; i<=n; ++i) for (j=sqrt(i);j>1;--j) if (i%j==0) {--freq; break;}
  return freq;
}

int main ()
{
  clock_t t;
  int f;
  t = clock();
  printf ("Calculating...\n");
  f = frequency_of_primes (99999);
  printf ("The number of primes lower than 100,000 is: %d\n",f);
  t = clock() - t;
  printf ("It took me %d clicks (%f seconds).\n",t,((float)t)/CLOCKS_PER_SEC);
  return 0;
}
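
Since a single sort of a small file can finish in less than one clock tick, one way this could be applied to the sorting case is to repeat the sort on a fresh copy of the data and divide the total by the number of repetitions. A rough sketch of that idea (the REPETICOES count and the random fill are illustrative, not taken from the question):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define TAM 1000
#define REPETICOES 1000   /* illustrative repetition count */

void selection_sort(int v[], int n){
    int i, j, min, aux;
    for(i = 0; i < n - 1; i++){
        min = i;
        for(j = i + 1; j < n; j++)
            if(v[j] < v[min]) min = j;
        aux = v[i]; v[i] = v[min]; v[min] = aux;
    }
}

int main (){
    int original[TAM], copia[TAM], i;

    for(i = 0; i < TAM; i++)
        original[i] = rand() % 10000;             /* stand-in for the file data */

    clock_t t = clock();                          /* clocks before all the runs */
    for(i = 0; i < REPETICOES; i++){
        memcpy(copia, original, sizeof original); /* sort a fresh copy each run */
        selection_sort(copia, TAM);
    }
    t = clock() - t;                              /* clocks elapsed over all runs */

    printf("Average per sort: %f seconds\n",
           ((double) t / CLOCKS_PER_SEC) / REPETICOES);
    return 0;
}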
