Understanding the levels of computing


May 14, 2025 - 13:12

I have been working mostly with Java and Python on the application layer, and I have only a vague understanding of operating systems and hardware. I want to understand much more about the lower levels of computing. At university I took a class on microprogramming, i.e. how processors are hard-wired to implement assembly instructions. I always thought I wouldn't get more done if I learned more about the low level.

I wonder how it is possible that hardware is hidden almost completely from the developer, and whether the operating system is a software layer over the hardware. In programming I have never come across the need to understand what an L2 or L3 cache is. In the typical business application environment one almost never needs to understand assembler or the lower levels of computing, because nowadays there is a technology stack for almost anything. I guess the whole point of these lower levels is to provide an interface to the higher levels. On the other hand, I wonder how much influence the lower levels can have, for example in graphics computing.
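To make the cache question concrete: even in a high-level language, memory access order can be measured. The sketch below (my own toy example, not from any particular source) sums the same matrix row by row and column by column; the row-major order follows the memory layout and is cache-friendly, while the column-major order jumps between rows on every access. CPython's interpreter overhead shrinks the gap compared to C, but the two orders compute the same result with measurably different access patterns.

```python
import time

def make_matrix(n):
    """Build an n x n matrix as a list of row lists."""
    return [[1] * n for _ in range(n)]

def sum_row_major(m):
    # Visit elements in the order they are laid out: row by row.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Visit elements column by column, jumping to a different row
    # on every access -- a much less cache-friendly pattern.
    n = len(m)
    total = 0
    for j in range(n):
        for i in range(n):
            total += m[i][j]
    return total

if __name__ == "__main__":
    m = make_matrix(2000)
    t0 = time.perf_counter(); r = sum_row_major(m); t1 = time.perf_counter()
    t2 = time.perf_counter(); c = sum_col_major(m); t3 = time.perf_counter()
    assert r == c == 2000 * 2000
    print(f"row-major: {t1 - t0:.3f}s  column-major: {t3 - t2:.3f}s")
```

The exact timings depend on the machine and interpreter, which is precisely the point: the hardware leaks through the abstraction.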

On the other hand, a branch of theoretical computer science works on abstract models of computation. However, I have rarely encountered situations where I found it helpful to think in terms of complexity models, proof verification, etc. I know that there is a complexity class called NP, and that for its hardest members (the NP-complete problems) no efficient algorithms are known, so they become practically unsolvable for large inputs. I'm missing a frame of reference for thinking about these things. There are all kinds of different camps that rarely interact.
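As a toy illustration of why exponential growth makes such problems intractable, here is a naive brute-force solver for subset sum, a classic NP-complete problem (the example values are my own invention). It tries every subset, and the number of subsets doubles with every element added, so going from 20 to 60 elements multiplies the work by roughly a trillion:

```python
from itertools import combinations

def subset_sum(nums, target):
    """Naive subset sum: try all 2^n subsets, smallest first.

    Returns the first subset (as a tuple) whose elements sum to
    target, or None if no such subset exists.
    """
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# Fine for a handful of elements; hopeless for a few hundred.
print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # -> (4, 5)
print(subset_sum([3, 34, 4, 12, 5, 2], 100)) # -> None
```

No known algorithm avoids this exponential blow-up in the worst case, which is what "hard for big N" means in practice.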

I have been reading about security issues. Here many different layers come together. Attacks and exploits almost always occur at the lower levels, so in this case it is necessary to learn about the details of the OSI layers, the inner workings of an OS, etc.