How To Program From Ground Up

Support document for my video series about learning to program computers from the ground up.


How to Program From The Ground Up (with Minimal BS)

©2024 Chris Athanas

A follow-along guide for anyone who wants a solid understanding of software development as a semi-historical walk through the essential parts of computing leading to present day software development paradigms.

This is not a thorough deep dive into any one topic, but a broad overview of the core concepts and principles that are missing from most programming tutorials and courses. My goal is to give you context about why things are the way they are in computing, how they came to be that way, and the issues and problems that arose along the way.

I have found that it's far more important to understand the process that people went through to create the current solutions than to understand the mechanical details of how the solutions work. Many details about the disadvantages and limitations are often left out of technical discussions, or worse, simply not known, or dismissed as irrelevant out of ignorance.

$\textcolor{yellow}{Please\ consider\ giving\ me\ a\ STAR\ as\ THANKS!\ ⭐️ 🤩}$

This document is a reference and follow-along guide for my video series on YouTube:

Check out the discussion group: https://twitter.com/i/communities/1759753866219040980

Table of Contents

Introduction

  • This is a "how-to" guide for anyone interested in creating software who needs an overview of techniques and concepts used, from the fundamentals of physical logic representation to high-level programming languages and the "how" and "why" of the various paradigms and methodologies used in software development.

  • My Goal is to Have You:

    • Understand more of the why and how of programming, not just the what and the mechanical explanations.

      • This is my curated list of information to take you on a realistic and grounded journey of understanding the essential part of computing to create effective software.
      • There will be some technical details, but only enough to understand the fundamental principles, not to be an expert.
      • I cover the areas that I had difficulty understanding when I was learning to program.
      • This is more-or-less a historical walk through WHY things are the way they are in computing, and HOW they came to be that way.

      I have since discovered that the majority of my confusion derived from how things were presented to me. Instructors often misunderstood the correct application and limits of their metaphors and lacked real-world experience with the things they were teaching. Many just repeated what they had been taught without understanding it. When challenged about their knowledge, they would often become defensive and dismissive, and sometimes even hostile.

      Many of the concepts are, in retrospect, full of needless jargon and unnecessary complexity. I now understand that the complexity was often used to make the instructor seem more knowledgeable and to make the subject seem more difficult than it actually is.

      I would like to say at the outset that there is a TREMENDOUS number of technical-sounding words that all refer to the same basic core ideas. I will do my best to be as consistent as possible, point out the multiple definitions, reduce the jargon, and clarify the core meanings and ideas.

The Essence of Computing

  • People used to do all computing by hand, and now we use various machines to do the same thing, in a much faster and more reliable way.
    • When we use a machine to do computing, we are just using the machine to represent the problem and the solution in a different way.
    • The machine knows nothing about the problem. It's only following orders created by clever humans, who use boolean logic to represent the problem and the sequence of actions that solves it.
    • Each operation in the computer was once done by teams of people working in groups, logically delineated in nearly the same way as the computer's components are arranged.
      • There were specialized roles for each person. For example, the "storage" would be a set of filing cabinets and a clerk to store and retrieve documents.
        • This is now done by the "hard drive" and the "file system" in the computer.
      • The arithmetic would be done by a person called a "calculator" who would perform the operations and record the results.
        • This is now done by the "Arithmetic Logic Unit" and stored in the "Registers" in the computer.

SOME IMPORTANT ITEMS TO KEEP IN MIND

  • THERE IS NO MAGIC IN COMPUTING, ONLY HUMAN CLEVERNESS, HUMAN SYSTEMIC THINKING AND HUMAN INGENUITY USED TO SOLVE HUMAN PROBLEMS.

    • If you hear anyone say "it's magic" or "it's a black box" or "it's kind of like a person," they are:
      1. Being lazy,
      2. Or indicating it's not relevant at the moment,
      3. Or (USUALLY) they don't understand the problem or the solution enough to explain it and become hand-wavy.
  • IT'S ALWAYS JUST HUMAN CLEVERNESS AND INGENUITY, THE IDEA OF REPRESENTING ONE THING AS ANOTHER, NOTHING MORE.

    • The machines CAN NEVER UNDERSTAND the problem or the solution in the way humans conceive of the problem.
    • These machines are only following the logical operations that humans have carefully designed to represent the problem and a solution "space." There is no inherent "understanding" in the machine, and there never can be.
    • The only way the machine would ever know the full human context of the problem (and the solution) is if the machine ACTUALLY was a human, and then it would be a human, and not a machine.
    • Mistaking the machine for having intelligence is known as "the ELIZA effect," and is a common mistake made by people who don't understand the limits of the machine's capabilities.

How To Install and Run The Samples In This Guide
