No.
This is a common question, and a common mistake. HTML is a markup language, as its name says: HyperText Markup Language. More broadly, it can be considered a data language. HTML only encapsulates data and describes what to do with it, not how to do it. As the English Wikipedia defines it:
A web browser can read HTML files and compose them into visible or audible web pages. The browser does not display the HTML tags, but uses them to interpret the content of the page. HTML describes the structure of a website semantically, along with cues for presentation, making it a markup language rather than a programming language.
It is not a programming language because it is not Turing complete. That is, it would need certain specific characteristics to be able to "program a device". You cannot run HTML; that is why you need JavaScript, which is a programming language, to do some things. Both can even be considered computational languages, but that is as far as the similarity goes.
Semantics: you program and/or code in JavaScript, but you only code in HTML (you create code that declares a form of presentation).
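To make the distinction concrete, here is a small sketch (hypothetical, all names mine): the HTML fragment is pure data declaring a presentation, while the JavaScript function actually computes something.

```javascript
// The HTML below is just data: it declares structure and presentation,
// but it cannot calculate the value it is supposed to display.
const markup = "<p>Total: <span id='total'>?</span></p>";

// The computation has to come from a programming language:
function total(prices) {
  // reduce performs an actual calculation over the data
  return prices.reduce((sum, p) => sum + p, 0);
}
// total([10, 20, 12]) === 42
```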
Turing machine
HTML is far from being Turing complete. It would need all of these capabilities:
- do calculations;
- change information contained in some type of memory;
- make decisions;
- change the execution flow.
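The four capabilities above can be sketched in a few lines of JavaScript (a hypothetical example of mine, just to show what HTML cannot express):

```javascript
let memory = [];                 // information held in some kind of memory
let sum = 0;
for (let i = 1; i <= 5; i++) {   // change the execution flow (a loop)
  sum += i;                      // do calculations
  if (i % 2 === 0) {             // make decisions
    memory.push(i);              // change information contained in memory
  }
}
// After the loop: sum === 15 and memory holds [2, 4]
```

There is no HTML construct equivalent to any of these four lines; the markup can only describe the result of such a computation, never perform it.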
Alan Turing devised a minimal theoretical machine that can perform these operations, and it has come to be accepted that any programming language must be able to do everything this universal machine can do.
These non-programming languages help instruct devices (computers running software, for example) to perform some task, but a markup language's ability to do this is very limited: it cannot perform all the operations of a Turing machine.
Besides, a markup language alone does nothing. In theory you could even build a machine that "understands" a markup language directly and does something with it, but I doubt that would have any real utility. In practice, with today's technology, these languages only work because they are interpreted by software that was created with... programming languages.
Why do we adopt the Turing machine as the criterion for what a programming language is? You need to draw a line somewhere; otherwise anything could be a programming language. Even the ASCII table could be considered one, because it is a set of rules whose use instructs, in a limited way, how a computer should operate. One could argue that the definition is arbitrary, but it makes sense, it is useful for classification, it was established early in modern computing, and it causes no problems.
Can you invent another criterion? Of course you can, but what for? There are those who claim that 1 plus 1 is not 2. It may even be true, but what would we gain by changing the concept? As a thought exercise it can be interesting, but it has no practical motivation, so whoever insists on it borders on insanity.
The classification of languages used today serves the community well. People who want to question it are welcome, but anyone who wants to overturn what has been established and set a new standard needs to convince people: give concrete data, show serious research that supports the change, and provide motivation for it. That is different from simply correcting a misconception held by some people when official documents say otherwise, as I did in "What is the difference between attribute and field in classes?".
Other languages
In general, languages whose names end in ML are markup languages; see XML. Of course this is just a first hint: there is a programming language called ML (Meta Language) and several derivatives of it (SML, Caml, OCaml, etc.). Although it does not appear in the name of most programming languages, the acronym PL ends up being associated with them somehow. One case where the name does carry the acronym is PL/SQL, which is obviously a programming language.
So SQL must be a programming language, right? No: at least in its ANSI version it is not Turing complete. The extensions provided by the main database systems on the market, mainly for writing stored procedures, are more complete and can be considered Turing equivalent (just an alternative name/synonym for Turing complete).
CSS is a style language and is also not a programming language.
XSLT (Extensible Stylesheet Language Transformations), on the other hand, is a declarative language like most markup languages, but it is a programming language, since it allows all the computations of the abstract Turing machine.
Both programming and markup languages can be declarative or imperative. These are different concepts. The confusion may arise because it is more common for PLs to be imperative and MLs to be declarative.
Lua is a PL that is used as markup in some situations. Of course only a subset is used, but it can serve in much the same way as JSON, which is an ML based on JavaScript, which in turn is a PL. Surely there are other languages whose syntax makes them easy to use as markup languages.
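The JSON case can be shown directly (a small sketch of mine): a JSON document is valid JavaScript literal syntax, pure data, and only becomes computation once the host programming language processes it.

```javascript
// JSON is a data language carved out of JavaScript's literal syntax:
const jsonText = '{"name": "example", "tags": ["a", "b"]}';

// The data language carries no computation by itself...
const data = JSON.parse(jsonText);

// ...computation happens in the host programming language:
const upper = data.tags.map(t => t.toUpperCase());
// upper holds ["A", "B"]
```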
If you consider HTML5+CSS3 as a single language, it becomes harder to say whether it can be considered a programming language. Surely it would be a very strange PL to use, but it can perform all the minimal computations, right? See Rule 110.
Why is knowing this important?
Because computing is an exact science: if you do not care about the accuracy of things, you are on the wrong track for this work. Besides helping you communicate better with your peers (including here), there is an important cognitive effect in trying to do and define things the right way. You are a reflection of what you practice. No one gets muscular without physical effort (or without taking steroids). No one will "communicate with the computer" properly if they never exercise precise communication in life. I am not saying you need to know everything or can never make a mistake, which would be naive, but you need to push yourself and keep this in mind. I don't know of a switch in the brain that turns the ability and the need for correct communication on or off; I just wish I had a compiler to point out my mistakes when I'm not programming :).
Look, from everything I have read in the answers here and in the links indicated, I would not consider it a "mistake" to list HTML in a résumé under Programming Languages. If we in the IT field are still debating this, an HR person will certainly not be sure either, and for lack of that line they might conclude that you do not know this "markup language" and screen you out of a possible interview.
– SneepS NinjA