The "computer" can't tell diddly about what a bit pattern represents. It just does things to 1's and 0's.
The programming language keeps track of what each group of 1's and 0's is for. The language can be assembly (the lowest-level programming language, closest to the 1's and 0's in the machine), Basic, Pascal, C (or dozens of others that are dead or dying).
Each language handles characters, numbers, and strings (of letters) a little differently, so when looking at the 1's and 0's you need to have an idea of what language the code was written in.
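To make that concrete, here's a small sketch showing the same four bytes read three different ways. Nothing about the bytes themselves says which reading is "right"; the program decides:

```python
import struct

raw = b"\x41\x42\x43\x44"  # the same four bytes; they mean nothing by themselves

as_text = raw.decode("ascii")         # read as ASCII characters: "ABCD"
as_int = struct.unpack("<I", raw)[0]  # read as a little-endian unsigned integer
as_float = struct.unpack("<f", raw)[0]  # read as a little-endian 32-bit float

print(as_text)   # ABCD
print(as_int)    # 1145258561
print(as_float)  # some float value; same bits, wildly different meaning
```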
Sometimes you can open a program file in Notepad and see strings of text (these are in ASCII format, a standard character encoding used across nearly all computer languages). Never save a program file you open with any kind of word processor! It may get corrupted.
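Finding that readable text in a binary file is exactly what the Unix `strings` utility does. Here's a minimal sketch of the same idea (the example "program file" bytes are made up for illustration):

```python
import re

def find_ascii_strings(blob, min_len=4):
    # pull out runs of printable ASCII at least min_len bytes long,
    # the same trick the Unix `strings` tool uses
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[ -~]{%d,}" % min_len, blob)]

# a made-up chunk of a "program file": machine-code bytes mixed with text
blob = b"\x90\x90Hello, world!\x00\xff\xfe\x01OK"
print(find_ascii_strings(blob))  # ['Hello, world!']  ('OK' is too short)
```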
ASCII is stored in 8-bit data groups, allowing 256 characters. The first 128 are well defined (numbers, letters, punctuation, and some control characters that go back to the old teletype terminals) and the last 128 (the "extended" set) can vary with the application. Note: the ASCII characters do not correspond directly to the values they look like, i.e. the character '0' is not the number 0 (its code is actually 48).
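You can poke at the ASCII codes directly; note how the character '0' and the number 0 are different things:

```python
# the character '0' is not the number 0: its ASCII code is 48
print(ord("0"))  # 48
print(ord("A"))  # 65
print(chr(65))   # A

# digits sit in one contiguous run, so arithmetic on the codes works
print(ord("7") - ord("0"))  # 7
```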
And to answer your question: the programming language has a compiler and linker that change the instructions you write into machine instructions, text, numeric data, etc. and put it all in a format the machine can read and execute.
Some of the newer web languages like Java and JavaScript (HTML is really markup that the browser interprets) use interpreters that take the programming instructions directly, or precompile them into an intermediate language (it used to be called p-code; today you'll hear "bytecode") that is easier for machines to use. In this case each machine must have the interpreter installed on it so it can take the code on the fly and run it. This kind of execution is inherently slower but allows for compatibility between machines (the same web page runs (almost) the same way in IE, Netscape or on a Mac).
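Python itself works this way too: your source gets compiled to an intermediate bytecode, and any machine with a Python interpreter can run it. You can peek at the bytecode with the standard `dis` module:

```python
import dis

def add(a, b):
    return a + b

# show the intermediate instructions the interpreter actually executes;
# you'll see opcodes like LOAD_FAST and a return (exact names vary by version)
dis.dis(add)
```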
Well, that ought to keep you busy till the turkey's done.