Tuesday, November 11, 2025

Week 25

This week helped me understand more clearly how an operating system keeps everything running at once, even when there’s only one CPU. We talked about how a process is just a running program with its own memory, registers, and instructions, and how the system context-switches between processes so quickly that it feels like they’re all running at the same time. I found that idea really interesting because it shows how much of what we see on a computer is an illusion created by time sharing and clever design.
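To test my understanding, I wrote a tiny C program on my own (this is my own sketch, not something from class) that uses fork() to create a second process. It shows the part that clicked for me: each process has its own private memory, so the child’s change to the variable never shows up in the parent’s copy.

/* My own sketch (POSIX fork, not from the lecture notes): each process
 * gets its own copy of memory, so the child's write to counter is
 * invisible to the parent. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    int counter = 0;
    pid_t pid = fork();               /* create a second process */
    if (pid < 0) {
        perror("fork");
        exit(1);
    } else if (pid == 0) {            /* child */
        counter = 100;                /* changes only the child's copy */
        printf("child:  counter = %d\n", counter);
    } else {                          /* parent */
        wait(NULL);                   /* let the child finish first */
        printf("parent: counter = %d\n", counter);   /* still prints 0 */
    }
    return 0;
}

Running it prints 100 from the child and 0 from the parent, which made the “its own memory” part feel concrete.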

Another big topic was limited direct execution. I learned that this is how the operating system lets a program run directly on the hardware for speed while still protecting the system. The hardware has to make sure one process can’t interfere with another or with the operating system itself, so ordinary code runs in restricted user mode, privileged work goes through the kernel via system calls, and a timer interrupt lets the OS regain control on a regular schedule. What surprised me is that while a process is running, the operating system itself isn’t running at all; it only gets the CPU back through a trap or an interrupt.
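I can’t trigger a real timer interrupt from a normal program, but I put together a rough user-level analogy on my own using setitimer and SIGALRM (which I had to look up, so treat this as my sketch of the idea rather than the real kernel mechanism): the loop below runs freely, and the timer keeps pulling control back into a handler, a bit like a timer interrupt handing the CPU back to the OS.

/* A rough user-level analogy, not the real kernel mechanism: the busy
 * loop is the "running program", and the periodic SIGALRM handler plays
 * the role of a timer interrupt taking control back. */
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <sys/time.h>

static volatile sig_atomic_t ticks = 0;

static void on_timer(int signum) {
    (void)signum;
    ticks = ticks + 1;                /* "the OS gets control back" */
}

int main(void) {
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_handler = on_timer;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGALRM, &sa, NULL);

    struct itimerval it;
    it.it_interval.tv_sec = 0;
    it.it_interval.tv_usec = 100000;  /* fire every 100 ms */
    it.it_value = it.it_interval;
    setitimer(ITIMER_REAL, &it, NULL);

    while (ticks < 10) {
        /* "user code" spinning freely; the handler interrupts it on every tick */
    }
    printf("timer fired %d times\n", (int)ticks);
    return 0;
}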

We also studied different scheduling algorithms and how they decide which process gets the CPU next. I practiced calculating turnaround time (completion minus arrival) and response time (first run minus arrival), which helped me see how different approaches change performance. For example, Shortest Job First minimizes average turnaround time by running the shortest jobs first, while Round Robin gives every process a small time slice in turn, which keeps response time low. The multilevel feedback queue was the hardest to wrap my head around at first, but once I understood that short, interactive jobs stay at high priority and finish quickly while long, CPU-heavy jobs get bumped down as they use up their time, it clicked.
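To double-check my turnaround-time math, I wrote a small sketch with made-up job lengths (all arriving at time 0): FIFO runs the jobs in the order given, Shortest Job First sorts them by length first, and each job’s turnaround is just its completion time.

/* My own sketch with made-up job lengths, all arriving at time 0:
 * turnaround = completion - arrival, and sorting by length gives the
 * Shortest Job First order. */
#include <stdio.h>
#include <stdlib.h>

static int cmp(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

static double avg_turnaround(const int *len, int n) {
    double total = 0;
    int clock = 0;
    for (int i = 0; i < n; i++) {
        clock += len[i];              /* job i finishes here */
        total += clock;               /* arrival is 0, so turnaround = clock */
    }
    return total / n;
}

int main(void) {
    int fifo[] = {100, 10, 10};       /* long job first hurts everyone behind it */
    int sjf[]  = {100, 10, 10};
    int n = 3;

    qsort(sjf, n, sizeof sjf[0], cmp);   /* SJF = run shortest jobs first */

    printf("FIFO average turnaround: %.1f\n", avg_turnaround(fifo, n));
    printf("SJF  average turnaround: %.1f\n", avg_turnaround(sjf, n));
    return 0;
}

With these numbers it prints an average turnaround of 110.0 for FIFO versus 50.0 for SJF, which made the difference between the two policies very concrete.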

My “aha” moment this week was realizing how much thought goes into something as simple as keeping programs running smoothly. I’m still curious how these scheduling ideas work on modern computers with multiple cores, but I feel like I’m starting to see how all the pieces fit together. 

