I need to use Python to run several commands on a system and, at the same time, save everything that happens into a DataFrame. There will be hundreds of commands per minute, for several hours, every day.
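To make the setup concrete, here is a minimal sketch of what I mean (the command list and the run_command helper are placeholders, not my real code):

```python
import subprocess
import pandas as pd

rows = []

def run_command(cmd: str) -> str:
    # Placeholder for the real cmd-like system: run the command, capture output.
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout

for cmd in ["echo one", "echo two"]:  # hundreds of these per minute in practice
    output = run_command(cmd)
    rows.append({"command": cmd, "output": output})

# Everything lives in memory until this point.
df = pd.DataFrame(rows)
```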
In this situation, how would you deal with the risk of something terminating the application abruptly during execution (e.g., a power outage, a computer crash, miscellaneous bugs)?
Please correct me if I'm wrong, but I believe that when I add data to a Pandas DataFrame, however large it grows, it is still just an in-memory variable during execution, so if something goes wrong the variable is lost entirely along with its data. Or is it stored somewhere else?
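This is the kind of incremental persistence I'm wondering about as an alternative; a sketch assuming a plain CSV file on disk (the file name and row values are made up):

```python
import csv
from datetime import datetime

# Appending each row to a file on disk persists it immediately,
# so a crash loses at most the row currently being written.
with open("command_log.csv", "a", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([datetime.now().isoformat(), "dir", "...captured output..."])
    f.flush()  # push Python's buffer to the OS right away
```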
I thought about saving to Excel, but with this data volume and processing speed, would it be feasible to append each executed command to an Excel file and save the file on every new piece of information?
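For reference, this is roughly what "save to Excel on every new row" would look like with pandas; note that to_excel rewrites the whole workbook on each call, which is why I doubt it scales (the file name is made up, and to_excel needs openpyxl installed):

```python
import pandas as pd

df = pd.DataFrame(columns=["command", "output"])

def log_row(command: str, output: str) -> None:
    global df
    df.loc[len(df)] = [command, output]
    # to_excel rewrites the entire file on every call,
    # so each save gets slower as the log grows.
    df.to_excel("command_log.xlsx", index=False)
```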
Or would saving to a SQL Server database be more appropriate? I suspect this volume of data would put a daily strain on the network, and there may be better alternatives.
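If SQL Server did turn out to be the answer, I imagine buffering rows and inserting them in batches rather than making one network round trip per command; a sketch using SQLAlchemy and pandas (the connection string, table name, and batch size are made up):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@server/dbname?driver=ODBC+Driver+17+for+SQL+Server"
)

buffer = []

def log_row(command: str, output: str) -> None:
    buffer.append({"command": command, "output": output})
    # Flush in batches to avoid one network round trip per command.
    if len(buffer) >= 100:
        pd.DataFrame(buffer).to_sql("command_log", engine,
                                    if_exists="append", index=False)
        buffer.clear()
```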
So, does anyone know a better alternative?
What form would these commands be saved in? A string that was typed into a cmd?
– LeandroHumb
Leandro, exactly. Strings from a cmd-like system would be saved, both the ones sent and the ones captured.
– GDVP