Assembly Language And Computer Architecture

  • Thread starter: aryan185
  • Replies: 14
  • Views: 6,393

aryan185 (Newbie)
Hi people, I really want to know about a good book on 8086 assembly language. A book that will make even the dumbest person like me understand and master it. I also want to find a good book on computer architecture. I am in Delhi and I am open to having someone teach me throughout the MCA course; I am willing to pay for it too. If anyone is interested, please share your contact details. Thanks.
 
^ Are you on an MCA correspondence course? If you are on a regular MCA course, then the best guidance can be given by your faculty. You need to put your details (with mail IDs) in your profile so that interested members can contact you. Personal messaging is not allowed for a newbie at present, and it is not safe to post contact details here.
 
Instead of looking for a book, try looking for a "problem" that you can solve with 8086 assembly. For example, try creating a TSR program that presses F2 every 10 seconds so that you never lose your work while editing things in Turbo Pascal / C++.
 
I am doing my MCA from IGNOU, so guidance is clearly a big problem, since there are no regular classes or faculty. People who are interested can contact me through my profile.
 
^ If you need face-to-face teaching, then contact an MCA graduate / senior near you. The counsellor at your IGNOU study centre can guide you, and you may advertise for an MCA tutor in the local newspapers.
[OT] I'd suggest you remove your mail ID from the post above and put it in your profile instead.
 
Never thought someone in this forum would be interested in assembly language. If anyone is interested, these are the things you need to write a working Win32 program in assembly language:

1. masm32.com -
Microsoft Macro Assembler and Linker. Even MSVS2005 uses this (one linker to link them all).

2. Iczelion's Win32 Assembly Homepage -
The one who taught me assembly language, and taught it well.

3. WinAsm Studio, The Assembly IDE - Free Downloads, Source Code -
For the beginner and the nerd; an IDE to help you write assembly programs, then assemble and link them.

4. Intel® 64 and IA-32 Architectures Software Developer's Manuals -
Detailed information on every instruction supported by every x86/x64 Intel processor ever released. You can write your own assembler/disassembler using this information (and some information on the Microsoft PE/COFF formats).

5. OllyDbg v1.10 -
The best 32-bit debugger on the planet (C/ C++ programmers, go away).

6. DataRescue Home Page : home of the IDA Pro Disassembler and of PhotoRescue -
The best disassembler on the planet. A demo version is available, which is more than enough for normal use.

You don't need anything else.
 


dear house, clearly you've missed the date of this thread. It's MORE THAN A YEAR OLD 😛
 
First read Gaonkar's book on the microprocessor (8085). You can then go for Douglas Hall's microprocessor book from Tata McGraw-Hill. Brey's microprocessor book from Prentice Hall India is also good. All of these are printed in India, so the cost is okay.
 
The thread starter has posted only twice, and that too in this thread, so I think he must have stopped visiting this site. Anyway, on topic: I studied the 8086 from John Uffenbeck (hope the spelling is right) and found it to be very good. For assembly programming, use MASM (refer to gregory house's post above).
 
Thanks for your help. OK, I will borrow Uffenbeck's book from my friend. Yes, I believe it will be good, because my friend bought it last year.

I am doing some self reading on Computer Architecture by reading Morris Mano's book "Computer System Architecture", Third Edition, Prentice Hall.
May I know which book you would suggest I read to understand what actually takes place inside a computer? I want to go deep and understand the basics.
Sample question-1
e.g. I type this text using a keyboard. Then it gets stored in memory. But how does it turn into zeroes and ones? I understand from the textbook that each flip-flop stores one bit. And so the letter "A" becomes a string of many zeroes and ones, and all the bits get stored in a register.
But how does that conversion from the character "A" to a 'string of many zeroes and ones' take place? Who does this job, and how is it done?
How does the storing of the 'string of zeroes and ones' take place?

Sample question-2
How do the lines of code (in e.g. the C programming language) get stored in memory? Is it the same way as a text file from e.g. MS Word?

Sample question-3
What happens when the lines of code (in e.g. the C programming language) get executed? What really happens at each line of code?

Sample question-4
What happens when the lines of code (in e.g. the assembly language used by Morris Mano in his basic computer) get executed? What really happens at each line of code?

Sample question-5
I read about micro-operations, but what are they exactly? I understand that e.g. AC+1 means the accumulator gets incremented. But what happens before the computer gets that final sum?

Sample question-6
Do we still use microprograms in computers with a Pentium CPU? Some writers say that the age of the microprogram is over. Is that true? If so, are all modern computers designed with a hardwired control unit? And is it useful at all to read about microprogramming in this day and age?

Sample question-7
Is it still useful for me to learn assembly language these days? If yes, then what should I learn? Is it MASM, TASM, or are there other useful ones?

I have tried reading the following books:


1. William Stallings - Computer Organization and Architecture
2. Hamacher, Vranesic, Zaky - Computer Organization
3. Patterson and Hennessy - Computer Organization and Design: The Hardware/Software Interface
4. John P. Hayes - Computer Architecture and Organization
5. Vincent Heuring and Harry Jordan - Computer Systems Design and Architecture
6. Andrew S. Tanenbaum - Structured Computer Organization

and many more books. I have also read Wikipedia.
But until now I cannot comprehend what exactly takes place inside the machine. I can understand that we use gates, flip-flops, registers, counters, etc.
It is like this: I understand that a potato gets cooked when placed in water that is boiled for some time. But what happens to that potato during the boiling? Why does it become soft? The potato was hard initially. How does the change take place? Why does the skin peel off? Why does the taste change? After being cooked it is tasty, but a raw potato is not. So what happens to the chemicals inside the potato?
So I understand that the accumulator gets incremented e.g. from zero to 100. The final "picture" is OK. But what is the "boiling process" that goes on while the program is being executed? What does the "boiling" do to the text file before it finally gets stored? What happens before I get that yummy potato or successfully-run program?

 
There are two parts to what you are talking about - hardware and software. Can't comment on hardware. So let me try to talk about software. If you already know about it, ignore it.

When a processor architecture is designed, an instruction set is defined for it that software programmers (mainly BIOS and OS writers) can use to control the processor. This is known as machine language. Now, machine language is pure numbers. You will have to use something like 74 / 0F 84 and 75 / 0F 85 (all in hexadecimal) if you want to write the jump-if-equal and jump-if-not-equal instructions, short and long. If you have the patience and knowledge, you CAN write a working program using a simple hex editor and plain machine language. But it is an error-prone and mundane task.

To make things simple, assembly language and the assembler were invented. Assembly language is nothing but a series of mnemonics used in place of the numeric instructions, and an assembler simply converts these mnemonics to the underlying instruction encodings: je for 74 / 0F 84 and jne for 75 / 0F 85. It is a trivial task (again, if you have the time and knowledge) to write an assembler for a particular instruction set. In the list that I have provided above, the Intel Software Developer's Manual (a similar set is also available from AMD) provides the complete mnemonics and underlying instruction encodings for the x86/x64 series of processors.

Now, it is not only BIOS manufacturers and OS writers who use assembly. Most operating systems are written in a higher-level language such as C, with an extremely tiny part written in assembler for reasons such as performance, access to certain processor/architecture-specific instructions, and bootstrapping code (since I don't write or read OS-related code I can't say I am 100% correct, but I don't think I am wrong here). But that is not the only place assembly language is used. Performance optimization is extremely necessary in many programs, including compilers, games, and software such as MS Word and Excel, and assembly is used in such software when the code produced by the optimizing compiler needs to be improved upon.

Coming back to assembly language: it is a productivity killer, difficult to maintain or scale up to bigger projects, and it ties you to a single architecture. To simplify things, higher-level languages were introduced and the compiler was invented. Talking primarily about compiled languages like C: C statements cannot be executed by the processor, because it does not understand them.
Whenever you write a program in C and invoke a compiler on the source files, a preprocessor first analyzes the files and takes care of conditional defines and things like that. The processed file is then compiled (by the compiler) into object code. And a linker (which also works on the output of the assembler mentioned a couple of paragraphs above) takes all the output, 'fixes' the parts of the object code that are related to addresses in memory, and finally produces the executable in the format that it supports.

Now let me try to tackle your questions -
1. Assume that you are typing something in a text box within an application running on Windows. Whenever you type something, the keyboard raises an 'interrupt' which makes the processor stop and handle the interrupt. How it is handled varies and you (and I - but I am no longer interested in that stuff) have to study things like Interrupt Service Routines (ISR), but its end product is that the Operating System gets control and it stores the value of the key pressed in the memory buffer related to the text box. If you understand or have programmed using pure Win32 API, you will know that every window has a procedure related to its class which receives messages such as key press, key down etc. These messages too are nothing but a consequence of the OS handling that event.

Now, nothing 'gets converted to zeros and ones'. Memory is a series of cells with states that are either on or off; we 'interpret' those states as zeros and ones. Since long binary strings are a pain in the a**, we generally use the hexadecimal system to represent most numbers. When you see an 'A' on screen, what has actually happened is that the OS has noted that a particular byte in memory (keeping things simple by not introducing Unicode and the like) has the number 0x41 stored in it. So the OS draws an A on the screen using the font that the text box is currently using.

Exhaustive list of interrupts - Ralf Brown's Files

2. Storing anything in memory takes place the same way. Every application asks the OS to provide it with memory, and stores its data in that memory.

3. As explained previously, the processor does not understand C or anything of the sort. It only understands machine language, and EVERY higher-level language has to be converted to machine language before it can be executed by the processor.

4. Haven't read Mano (had a copy a long time back, though). The processor executes the instruction and changes the state of the flags after the execution is complete. (In fact, every new-generation CISC processor can execute multiple instructions in parallel, but the compiler has to be extremely clever in generating code: if it uses inefficient branching, the entire processor pipeline has to be flushed, and that is a waste of precious processing time.) The Intel manual has detailed information on which instruction affects which flag in what manner.

5 and 6. Can't talk about them; no experience. But you can try this - Microcode - Wikipedia, the free encyclopedia.

7. Learning assembly language helps you understand and appreciate how software really works under the hood. The best way to learn it is by writing programs in it (using the tools and links mentioned above). And learning reverse engineering helps you understand how people implement certain algorithms and how code functions after linking. OpenRCE is a good 'legal' source, but there are a lot of websites that teach the 'dark arts'. No harm in learning from them, but I can't mention any sources. In fact, nothing beats the reverse engineering + assembly language combination. When you pit your brains against a security system and you win, it is a different feeling altogether.

As for employment opportunities, compiler construction teams, embedded software developers, antivirus companies, software security solution companies, etc. will appreciate people with a strong grasp of assembly language, the C language, and OS fundamentals and APIs.

MASM is a good one but is Windows-only. TASM is quite old. Two assemblers that are popular are Randall Hyde's HLA (High Level Assembly) and the open-source NASM. IDEs and plenty of learning material are available for all of them. Instead of books, I would suggest you rely on the internet for information. Nothing beats it.
 
I read your explanation - a very simple way of explaining things. OK, now I get some idea. So maybe I should go ahead and take up assembly language and see what can be done in compiler construction or embedded systems. What I find interesting is "embedded systems". Can you suggest how I should go about it?
 
Embedded systems is a huge field that covers everything from consumer electronics and mobile phones to health care and aircraft systems. You will need some kind of qualification in either electronics or computer engineering to get into the field. Once you have that, THEN you need thorough knowledge of things such as assembly language, algorithms, and so on. Then there is the matter of whether you want to work with hardware or software.

Some institutes do conduct courses on embedded software programming, but I am not sure how they work; you will have to search or ask around. Further, remember that assembly language is architecture-specific. One of the most prevalent architectures in the world is ARM, which is a RISC processor architecture, and Linux-based kernels have become increasingly common. Books on ARM and embedded software programming are available in the market, but the good ones are not cheap. There is no point buying them without knowing what it is all about.

If you are learning for employment's sake, the college/institute road is the only way. If you are doing it purely for understanding, then there are no restrictions, are there?
 
in India there aren't good, recognised & well-known embedded systems courses. That's why people usually do engineering in electronics or similar & go for a master's outside India, generally in the US
 
