
CPT 304 Ashford University Week 2 Operating Systems Discussion

 

In your responses, you will evaluate your peers’ work from this week and provide suggestions or comments about how they could improve Section 2 of their concept maps before final submission. Describe any organizational elements that surprised you. Identify any pieces of information that you felt were missing or unclear. Propose at least one recommendation about how each peer can refine the organization or clarify the concepts of threads and process synchronization.

Response 1

Kendric Garmon

May 27, 2021 at 2:34 PM

[Attached concept map: CPT304 Wk 2 Interactive Assignment.png]

A process is a program in execution, and it changes state as it runs. Those states are New, Ready, Running, Waiting, and Terminated. A process control block (PCB) is a data structure the operating system keeps for each process; it stores data relevant to the process, including (a minimal struct sketch follows the list):

  • Program counter
  • CPU registers
  • CPU-scheduling information
  • Memory-management information
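
As a concrete illustration (not from the original post), the PCB can be sketched as a plain C struct; the field names below are hypothetical and simplified, not taken from any particular kernel:

```c
#include <stdio.h>

/* A simplified, hypothetical process control block.
   Real kernels (e.g., Linux's task_struct) keep far more state. */
enum proc_state { NEW, READY, RUNNING, WAITING, TERMINATED };

struct pcb {
    int             pid;             /* process identifier                */
    enum proc_state state;           /* New, Ready, Running, Waiting, ... */
    unsigned long   program_counter; /* address of the next instruction   */
    unsigned long   registers[16];   /* saved CPU register contents       */
    int             priority;        /* CPU-scheduling information        */
    void           *page_table;      /* memory-management information     */
};

int main(void) {
    struct pcb p = { .pid = 42, .state = READY, .priority = 5 };
    printf("pid=%d state=%d priority=%d\n", p.pid, p.state, p.priority);
    return 0;
}
```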

A thread is a flow of control within a process; it comprises a thread ID, a program counter, a register set, and a stack. A multithreaded process can improve performance by carrying out more than one task at a time, whereas a single-threaded process has only one flow of control and performs one task at a time, even though it owns all of the process's resources.
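
For example, POSIX threads let one process run several flows of control that share the process's global data. This is a minimal sketch of my own, assuming a POSIX system; it only shows that all the threads see the same variable:

```c
#include <pthread.h>
#include <stdio.h>

/* Every thread in the process sees this variable; each thread
   still has its own ID, registers, and stack. */
static int shared_value = 0;

static void *worker(void *arg) {
    long id = (long)arg;
    printf("thread %ld sees shared_value = %d\n", id, shared_value);
    return NULL;
}

int main(void) {
    pthread_t tids[3];

    shared_value = 99;
    for (long i = 0; i < 3; i++)
        pthread_create(&tids[i], NULL, worker, (void *)i);   /* spawn threads */
    for (int i = 0; i < 3; i++)
        pthread_join(tids[i], NULL);                          /* wait for all  */
    return 0;
}
```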

The critical-section problem refers to the fact that processes (or threads) running in parallel cannot be allowed to update shared resources at the same time, as the sketch below illustrates.
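
To see why, here is a small sketch of my own: two threads increment the same counter with no coordination, so the increments interleave and updates are lost.

```c
#include <pthread.h>
#include <stdio.h>

static long counter = 0;          /* shared resource used by both threads */

static void *increment(void *arg) {
    for (int i = 0; i < 1000000; i++)
        counter++;                /* read-modify-write is not atomic, so
                                     concurrent updates can be lost       */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, increment, NULL);
    pthread_create(&b, NULL, increment, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);   /* expected 2000000, usually less */
    return 0;
}
```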

Response 2

Jason Miller

May 27, 2021 at 5:08 PM

My Concept Map:

[Attached concept map: WK2.CM (1).jpg]

Process: A process is an execution stream in the context of a particular process state. By execution stream, we mean a sequence of instructions executed sequentially, i.e., only one thing happens at a time.

Process State: Every process has a state associated with it at any given time. The states include new, ready, running, waiting, and terminated.

Process Control Block: The data structure the operating system keeps for each process; it tracks processes as they are swapped in and out and holds the information that must be saved and restored at each switch.

Single Thread vs. Multi-thread: A single-threaded process has one flow of control, so it performs one task at a time from start to finish. In multi-threading, several threads execute concurrently while sharing the process's resources. Depending on the application, multi-threading can offer a significant performance boost over single-threaded execution, but it is more complicated to program and to debug.

Critical Section Problem:

The critical section is an area of code in which processes access variables they share with one another. The key rule is that only one process may execute in its critical section at a time; all other processes must wait their turn, which prevents a race condition. According to Tutorials Point (n.d.), a race condition happens when “the result of multiple thread execution in critical section differs according to the order in which the threads execute” (para. 1). A solution to the critical-section problem must provide mutual exclusion, progress (a process may enter the critical section when it is free), and bounded waiting (each process’s wait time is limited).
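
One common way to enforce mutual exclusion in practice is a mutex lock around the critical section. The sketch below is my own illustration (not from the post); it guards a shared counter so only one thread updates it at a time:

```c
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *increment(void *arg) {
    for (int i = 0; i < 1000000; i++) {
        pthread_mutex_lock(&lock);    /* entry section: wait for exclusive access */
        counter++;                    /* critical section: one thread at a time   */
        pthread_mutex_unlock(&lock);  /* exit section: let another thread enter   */
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, increment, NULL);
    pthread_create(&b, NULL, increment, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);   /* reliably 2000000 with the lock */
    return 0;
}
```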

From my research, one solution to the critical-section problem is Peterson’s solution. It achieves mutual exclusion and satisfies the progress and bounded-waiting requirements.
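
For reference, here is a minimal two-thread sketch of Peterson’s algorithm using the classic flag/turn variables (my own illustration; on modern hardware the shared variables would need atomics or memory fences, and plain volatile is used here only to keep the textbook form readable):

```c
#include <pthread.h>
#include <stdio.h>

/* Peterson's algorithm for two threads (ids 0 and 1).
   flag[i] means "thread i wants to enter"; turn yields to the other thread. */
static volatile int flag[2] = {0, 0};
static volatile int turn = 0;
static long counter = 0;

static void *worker(void *arg) {
    int i = *(int *)arg;
    int other = 1 - i;

    for (int k = 0; k < 100000; k++) {
        flag[i] = 1;                              /* announce intent          */
        turn = other;                             /* give the other priority  */
        while (flag[other] && turn == other)
            ;                                     /* busy-wait for our turn   */
        counter++;                                /* critical section         */
        flag[i] = 0;                              /* exit section             */
    }
    return NULL;
}

int main(void) {
    pthread_t t0, t1;
    int id0 = 0, id1 = 1;
    pthread_create(&t0, NULL, worker, &id0);
    pthread_create(&t1, NULL, worker, &id1);
    pthread_join(t0, NULL);
    pthread_join(t1, NULL);
    printf("counter = %ld\n", counter);   /* 200000 if mutual exclusion holds */
    return 0;
}
```

Each thread announces its intent with flag[i] and then hands the turn to the other thread; a thread waits only while the other thread both wants in and holds the turn, which is how the algorithm provides mutual exclusion, progress, and bounded waiting.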