It is neither good nor bad practice to use the wildcard (`*`). It depends on your goal.
I want to be a good programmer (probably your case)
A good programmer knows that either they will need all the columns of a row (for example, in a "show all data" style listing), and will use `*`, or they will need only one or a few columns (in a subquery, for example), and will select ONLY the ones they need.
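For example, a minimal sketch of both cases (the `customers` and `orders` tables here are hypothetical):

```sql
-- A "show all data" listing: every column of the row is wanted, so * fits.
SELECT * FROM customers;

-- A subquery where only one column is needed: select ONLY that column.
SELECT name
FROM customers
WHERE id IN (SELECT customer_id FROM orders WHERE total > 100);
```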
As a good programmer, you do not know which columns will be added to your tables later, and you know that the only way to keep your system running without changes as columns are added is to use the wildcard (`*`). A framework, in general, uses database metadata to find out which columns exist, or even keeps this information in its settings. But that information has to be cached by someone, whether on the host running the application (PHP, for example) or on the database host; the work always falls to someone.
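As a sketch of what such a framework does under the hood (the `information_schema` views are standard SQL; the table name is hypothetical):

```sql
-- Discover a table's columns from the database's own metadata.
-- Frameworks typically run a query like this once and cache the result.
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_name = 'customers';
```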
In joins, however, where we combine different tables, columns with equal names may appear (now or in the future), and the best approach is to specify the columns one by one, giving different aliases ("nicknames") to the columns that share a name, as in the sketch below.
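A minimal sketch of such a join (tables and columns are hypothetical): both `customers` and `products` have a `name` column, so each one gets its own alias.

```sql
-- Selecting * here would return two ambiguous "name" columns,
-- so we list the columns and give the duplicates distinct aliases.
SELECT c.name AS customer_name,
       p.name AS product_name,
       o.total
FROM orders o
JOIN customers c ON c.id = o.customer_id
JOIN products  p ON p.id = o.product_id;
```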
Tight budget
Here you are either the person responsible for the project budget, or you are aware that you should minimize the money spent on the database. Some servers limit traffic by the number of queries, others by the amount of data transferred. In general, you will only pay attention to `*` when you cannot afford to move many bytes, or when the requirements-gathering step (before programming begins) established that there will be a very large number of database records or intense network use.

If there is heavy network traffic, the number of queries becomes the problem: you save a lot on data transferred, but that does little for multiple/concurrent access (at that point you get into CAP/ACID theory, which is beyond the scope of basic programming; in general you would drop relational DBMSs and use NoSQL).

The price of Internet access keeps falling, and so do the prices of database servers; over time you will worry less and less about how your system accesses the database and more about how it presents itself to the user.
I want to keep my client (or my boss/company) in my hand (you won't be able to)
In the past, some programmers had the bad faith to ignore programming principles in order to "keep the job/client". They thought that making changes to the system difficult was the way to do it, and so they rarely or never used the wildcard (`*`). But this has changed a lot: what matters now is the data in the database, not the system that uses it.

If you study a little, you will see that it is easy to find out everything about a database, even one you did not build, and from there to work out how the system should behave. This is called reverse engineering. Beyond that, by watching the system run, there is reengineering, which is building one system that emulates another.

In other words, there is no longer any dependence on the guy who wrote the queries without the wildcard; you will simply be fired, or tarnish your reputation in the market with other customers.
In short
There is no basis for this concern today (back in the era when every bit counted, yes, there was), and you will probably always use the wildcard, except in joins, where columns with equal names can appear from different tables and handling that is harder.
I think the people who downvoted did not understand: it all depends on the goal, and the goal can change. Today I deliver a system to the customer and it is 100% suited to what they asked for; tomorrow it may not be (the customer requests maintenance). If we always design "a system prepared for the future", we fill the solution with theory, creating heavy, large, and complex software, and that is what would earn you a downvote today. Robust software should be used for problems that require robustness, not for cases where the system will most likely never have greater needs.
Briefly: it is about not fetching more data than you need. If you only want the value of `id`, there is no need to bring along another 50 columns. – Kazzkiq
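To illustrate that comment, a minimal sketch (the table and column names are hypothetical):

```sql
-- Fetch only the single column you need, not the other 50.
SELECT id FROM customers WHERE email = 'user@example.com';
```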