I have the following C code, which I compile on Linux with gcc and -lm (see below).
What happens is this: f(x+dx) - f(x) comes out as zero, which it should not! Both f(x+dx) and f(x) are being computed correctly (I even added a print to check), and the difference between them is not below the available precision (i.e., dx is not so small that f(x+dx) - f(x) rounds to 0.0).
I'd like to understand what's going on! I tried using 2.0 (so it wouldn't be an integer), tried storing the values in variables and passing those (rather than returning the expression directly or evaluating the function inside the print), and several other things. What am I doing wrong?
#include <stdio.h>
#include <math.h>

double f(double x){
    double r = 0;
    /* debug print to check x and f(x) */
    printf("\n x= %f, f(x) = %f\n", x, pow(x,2)+6*x);
    r = pow(x,2)+6*x;
    return r;
}

double derivatef(double x, double){
    return (( f(x+dx) - f(x) ) / dx);
}

void main(){
    double f_x, df_x; // d = derivative
    /*
    1 - Using the right-hand (forward) derivative method,
    compute f'(x) at the points x = -2 and 2.
    How do you know the value found is correct
    without computing the analytic value of f'(x)?
    */
    f_x = f(2);
    df_x = derivatef(2, 0.001);
    printf("x=2, f(x) = %f, f'(x) = %f", f_x, df_x);
}
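For reference, the values I expect: analytically f'(x) = 2x + 6, so f'(2) = 10; with dx = 0.001 the forward difference should give (f(2.001) - f(2)) / 0.001 = 0.010001 / 0.001 = 10.001, not zero.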
This code has several syntax errors and does not even compile.
– Maniero
I was able to compile it, and there was no syntax error on my machine. What I found strange was getting zero as the result of the subtraction. What I got wrong was in the function declaration (I didn't put double in front of the parameter to declare its type). My teacher just emailed me about it. I even wanted to delete the question, but I couldn't find where.
– Alecks Rolf
So your compiler is crazy. http://ideone.com/pVLKM3
– Maniero
That's probably what made me spend so much time trying to find the error. I left out the type out of habit from programming in another language. Since that kind of declaration looked natural to me, I was looking for a different mistake. Thank you.
– Alecks Rolf
@Alecksrolf: you lost some characters in the definition of the derivative function:
double dx){
– JJoao
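For completeness, here is a minimal compilable sketch with JJoao's fix applied (the missing dx parameter name restored) and the debug print dropped; int main is used since the standard requires it:

#include <stdio.h>
#include <math.h>

/* f(x) = x^2 + 6x */
double f(double x){
    return pow(x,2) + 6*x;
}

/* forward-difference approximation of f'(x) */
double derivatef(double x, double dx){
    return (f(x+dx) - f(x)) / dx;
}

int main(void){
    double f_x  = f(2);
    double df_x = derivatef(2, 0.001);
    /* prints x=2, f(x) = 16.000000, f'(x) = 10.001000 */
    printf("x=2, f(x) = %f, f'(x) = %f\n", f_x, df_x);
    return 0;
}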