I have a question about how to measure the run time of just the sorting step of an algorithm. I searched a lot on the Internet and found only superficial material, nothing that helps with what I need.
For an assignment I have to measure the time the algorithm takes to sort only the data read from an external file (in this case random numbers from 1 to 1000, with no repetition, in a teste.txt file). The code runs, but whenever I try it with a teste.txt containing fewer than 1000 unsorted values, the measured time comes out as zero. When a colleague told the teacher he was testing with 10,000 values, the teacher smiled and told him to test with 10, 20, 50, 100, 150, 200... and so on. Is there any way to measure the time the algorithm takes to run that works both for small inputs and for larger ones?
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <sys/time.h>

#define TAM 1000

void SelectionSort_1(int vetor[]);

int main(){
    int iCont, vetor[TAM];
    FILE *ent;

    ent = fopen("teste.txt", "r");
    if(ent == NULL){
        printf("Erro! Nao consegui abrir o arquivo...\n");
        exit(1);
    }

    /* Read up to TAM integers; fscanf returns 1 for each value read. */
    for(iCont = 0; iCont < TAM; iCont++){
        //printf("Lendo posicao %d\n", iCont);
        if(fscanf(ent, "%d", &vetor[iCont]) != 1){
            printf("Opa, tentei ler alem do final do arquivo...\n");
            break;
        }
    }
    fclose(ent);

    SelectionSort_1(vetor);

    printf("\n\nOrdenado: \n\n");
    for(iCont = 0; iCont < TAM; iCont++){
        printf("%d ", vetor[iCont]);
    }
    printf("\n\n\n\n");

    return 0;
}

void SelectionSort_1(int vetor[]){
    int iCont, jCont, min, aux;
    struct timeval tv1, tv2;

    gettimeofday(&tv1, NULL);                 /* timestamp before sorting */
    for(iCont = 0; iCont < TAM - 1; iCont++){
        min = iCont;
        for(jCont = iCont + 1; jCont < TAM; jCont++){
            if(vetor[jCont] < vetor[min])
                min = jCont;
        }
        if(min != iCont){                     /* swap only when a smaller element was found */
            aux = vetor[iCont];
            vetor[iCont] = vetor[min];
            vetor[min] = aux;
        }
    }
    gettimeofday(&tv2, NULL);                 /* timestamp after sorting */

    printf("Total time = %.8f seconds\n",
           (double) (tv2.tv_usec - tv1.tv_usec) / 1000000 +
           (double) (tv2.tv_sec - tv1.tv_sec));
}
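On very small inputs the sort finishes in less time than gettimeofday can resolve, which is why the printed time comes out as zero. Two common workarounds are using the higher-resolution POSIX clock clock_gettime(CLOCK_MONOTONIC) and repeating the sort many times on fresh copies of the data, then dividing the total by the repetition count. A minimal sketch of that idea, assuming the gettimeofday calls are removed from SelectionSort_1 so it only sorts; REPS and the tempo_medio wrapper are illustrative names, not part of the original program:

#include <stdio.h>
#include <string.h>
#include <time.h>

#define TAM 1000
#define REPS 1000                       /* illustrative repetition count */

void SelectionSort_1(int vetor[]);      /* the sort from the question */

/* Times REPS runs of the sort and returns the average seconds per run. */
double tempo_medio(const int original[TAM]){
    int copia[TAM];
    struct timespec t1, t2;
    int r;

    clock_gettime(CLOCK_MONOTONIC, &t1);
    for(r = 0; r < REPS; r++){
        /* sort a fresh copy each time so every run sees unsorted data */
        memcpy(copia, original, sizeof copia);
        SelectionSort_1(copia);
    }
    clock_gettime(CLOCK_MONOTONIC, &t2);

    double total = (t2.tv_sec - t1.tv_sec)
                 + (t2.tv_nsec - t1.tv_nsec) / 1e9;
    return total / REPS;
}

Called as printf("Media por ordenacao = %.9f s\n", tempo_medio(vetor));, this produces a measurable interval even when a single run is far below the clock's resolution, because the timed region is REPS times longer than one sort.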
Are you talking about asymptotic time? The kind that tells you the running time of selection sort grows with the square of the input? That insertion sort is quadratic in the average case but linear in the best case? That merge sort always executes n log n steps?
– Jefferson Quesado
@Jeffersonquesado I didn't understand 100% of what you said, because I'm only now learning about asymptotic analysis in another course. What I need to know is how long Bubble/Selection/Merge... took to sort the shuffled file. Thank you.
– Adriel Gama
Then just call the function that gets the system time (in millis), or the time the program has been running. One call before sorting, another after, and take the difference. I don't remember the name of the function, but I think it's time from time.h
– Jefferson Quesado
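In case the exact name helps: the time.h routine usually used for this before/after pattern is clock(), which returns processor time in ticks; time() itself only has one-second resolution, too coarse for a single sort. A minimal sketch of the pattern described above, where the medir wrapper name is hypothetical:

#include <stdio.h>
#include <time.h>

extern void SelectionSort_1(int vetor[]);  /* the sort being timed */

/* Take one clock() snapshot before and one after, print the difference. */
void medir(int vetor[]){
    clock_t inicio = clock();
    SelectionSort_1(vetor);
    clock_t fim = clock();
    printf("CPU time = %.6f s\n",
           (double)(fim - inicio) / CLOCKS_PER_SEC);
}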
I'll do a thorough search on the time library, then. The bad part is that it's a bit hard to find material in Portuguese, but since there's no other way and I need to practice my English anyway, off to the foreign sites... kkk. Thanks for the tip; if it works out I'll post the result here!!
– Adriel Gama
Just don't forget that it should go in an answer :-) We don't answer our own questions inside the question itself. Some people forget that, so I decided to leave the warning.
– Jefferson Quesado
Examples of self-answered questions: https://answall.com/q/311780/64969 ; https://answall.com/q/314163/64969
– Jefferson Quesado
Thank you!! kkk, I'll use the answer button just below. :D
– Adriel Gama