So, I have been taking diligent notes on Assembly Language from the explainer/e-book called Some Assembly Required on GitHub recently, and I got to a point where I realized that a LOT of it is math-centric. Which is "obviously, duh!" for some, but I wasn't totally aware of that going into the document.
Brief run-down of what AL is (in my Noob Speak):
It (Assembly Language) is the human-readable (and human-writable) part of code entry that gets put through a "translator" (an Assembler), which converts that code into machine code (just numbers, and groupings of them) that tells the CPU what to do, ultimately via the binary number scheme (1's and 0's).
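To make that translation step concrete, here's a minimal sketch (assuming x86-64 with Intel syntax, which is what Some Assembly Required teaches; the exact bytes depend on the instruction and architecture). The assembler takes each human-readable line and emits the raw machine-code bytes the CPU actually runs:

```asm
; what you write                 ; what the assembler emits (hex bytes)
mov eax, 1   ; put 1 in EAX      ; -> B8 01 00 00 00
add eax, 1   ; add 1 to EAX      ; -> 83 C0 01
```

So the "human" side and the "numbers" side are the same instructions, just in two different forms.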
That's about the best I can explain the tenets of it.
So, knowing that AL is primarily number-focused, I went ahead and got through many sections of Some Assembly Required so I could understand the examples given, and ideally continue on with studying/learning AL.
But while the examples provided (the commands/prompts/etc.) were in fact legible and understandable for/by me, I wasn't super thrilled that the equations/commands (in the examples provided, though just simple low-level addition) involved different "arrangements" (a word I will use) to achieve X result that a CPU can read.
It seems high-minded in the explanation I just wrote, but what I am saying is: I didn't like that I had to become familiar with (and of course understand) the different "arrangements" and methods for AL. Of course, typing "1 + 1" isn't the way AL works, and it isn't the way registers and CPUs work (with data), so I am not foolishly vomiting a complaint. I am saying this is a subject matter that I do not wish to pursue at this stage of my life. :)
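For anyone curious what one of those "arrangements" actually looks like, here's a hedged sketch of "1 + 1" in x86-64 assembly (Intel syntax; the register choice is my assumption, and other architectures arrange this differently). Instead of writing the expression itself, you move one operand into a register and then add the other to it:

```asm
mov rax, 1   ; load the first 1 into the RAX register
add rax, 1   ; add the second 1; RAX now holds 2
; the result lives in RAX; there is no standalone "1 + 1" expression
```

That register-by-register dance is exactly the kind of rearranging the book walks you through.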
So what of it?
Well, nothing at this point. I will continue on with life in the world, and not concern myself with this project.
All is good