I second this course. It's awesome. The next step from there is probably Coursera's VLSI course, starting with https://www.coursera.org/learn/vlsi-cad-logic . It's all about how real-world VLSI CAD tools work.
You just described a few different sub-fields of computer engineering:
1. Processes, which involves lots of materials science, chemistry, and low-level physics. This involves the manufacturing process, as well as the low-level work of designing individual transistors. This is a huge field.
2. Electrical circuits. These engineers take the specifications provided by the foundry (transistor sizes, electrical conductance, etc.) and use them to create circuit schematics and physically lay out the chip in CAD. Once you finish, you send the CAD file to the group from #1 to be manufactured. Modern digital designs have so many transistors that they have to be laid out algorithmically, so engineers spend lots of time creating layout algorithms (a field called VLSI).
3. Digital design. This encompasses writing SystemVerilog/VHDL to specify registers, ALUs, memory, pipelining etc. and simulating it to make sure it is correct. They turn the dumb circuit elements into smart machines.
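To give a flavor of #3, here's a minimal sketch of the kind of thing you'd write in an HDL — a clocked, resettable adder. The module name and bit widths are made up for illustration:

```verilog
// Hypothetical building block: an 8-bit registered adder.
// This is the level group 3 works at -- registers and logic,
// not transistors.
module add_reg (
    input  wire       clk,
    input  wire       rst,
    input  wire [7:0] a,
    input  wire [7:0] b,
    output reg  [7:0] sum
);
    // On every rising clock edge, latch a + b into a register.
    always @(posedge clk) begin
        if (rst)
            sum <= 8'd0;
        else
            sum <= a + b;
    end
endmodule
```

Notice there's no mention of transistors or physical placement anywhere — that's exactly the abstraction boundary between group 3 and group 2.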
It's worth noting that each of the groups primarily deals with the others through abstractions (Group 1 sends a list of specifications to group 2, Group 3 is given a maximum chip area / clock frequency by group 2), so it is possible to learn them fairly independently. Even professionals tend to have pretty shallow knowledge of the other steps of the process since the field is so huge.
I'm not super experienced with process design, so I'll defer to others in this thread for learning tips.
To get started in #2, the definitive book is The Art of Electronics by Horowitz & Hill. Can't recommend it enough, and most EEs have a copy on their desk. It's also a great beginner's book. You can learn a lot by experimenting with discrete components, and a decent home lab setup will run you around $100. Sparkfun/Adafruit are also great resources. For VLSI, I'd recommend this Coursera course: https://www.coursera.org/learn/vlsi-cad-logic
To learn #3, the best way is to get an FPGA and start implementing increasingly complicated designs, e.g. basic logic gates --> counters --> hardware implementation of arcade games. This one from Adafruit is good to start: https://www.adafruit.com/product/451?gclid=EAIaIQobChMIhKPax... , though if you want to make games you'll need to pick up one with a VGA port.
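The classic "step two" on that ladder — a counter — is also the classic first FPGA project, because you can wire it to an LED and see it run. A rough sketch (the port names and the idea of a board oscillator are assumptions, not specifics of the Adafruit board above):

```verilog
// Sketch of a first FPGA project: a free-running counter whose
// top bit blinks an LED at a human-visible rate.
module blink (
    input  wire clk,   // board oscillator
    output wire led
);
    reg [23:0] count = 24'd0;

    // Increment on every clock edge; the counter wraps naturally.
    always @(posedge clk)
        count <= count + 1'b1;

    // The top bit toggles once every 2^23 clock cycles.
    assign led = count[23];
endmodule
```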
Silicon design & manufacturing is super complicated, and I still think that it's pretty amazing that we're able to pull it off. Good luck with learning!
(Source: TA'd a Verilog class in college, now work as an electrical engineer)
I'd start with learning a hardware description language and describing some hardware. Get started with Verilog itself. I'm a fan of the [Embedded Micro tutorials]( https://embeddedmicro.com/tutorials/mojo ) -- see the links under Verilog Tutorials on the left (they're also building their own HDL, which unless you own a Mojo board isn't likely of interest). Install Icarus Verilog and run through the tutorials making sure you can build things that compile. Once you get to test benches, install Gtkwave and look at how your hardware behaves over time.
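To make the Icarus/Gtkwave loop concrete, here's a hedged, self-contained sketch of what "getting to test benches" looks like — a made-up unit under test plus a bench that dumps a waveform file:

```verilog
`timescale 1ns/1ps

// A tiny unit under test, invented for this sketch: a clocked register.
module d_reg (input wire clk, input wire [7:0] d, output reg [7:0] q);
    always @(posedge clk) q <= d;
endmodule

// The test bench: generates a clock, drives stimulus, and dumps a
// VCD file you can open in Gtkwave to see how q tracks d over time.
module tb;
    reg        clk = 0;
    reg  [7:0] d = 0;
    wire [7:0] q;

    d_reg dut (.clk(clk), .d(d), .q(q));

    always #5 clk = ~clk;        // 10 ns clock period

    initial begin
        $dumpfile("tb.vcd");     // open this file in Gtkwave
        $dumpvars(0, tb);
        d = 8'd3;  #20;
        d = 8'd42; #20;
        $finish;
    end
endmodule
```

With Icarus installed, something like `iverilog -o tb tb.v && vvp tb` should produce tb.vcd for Gtkwave.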
You can think of "IP cores" as bundled up (often encrypted/obfuscated) chunks of Verilog or VHDL that you can license/purchase. Modern tools for FPGAs and ASICs allow integrating these (often visually) by tying wires together -- in practice you can typically also just write some Verilog to do this (this will be obvious if you play around with an HDL enough to get to modular design).
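"Tying wires together" in plain Verilog looks like this — the two core names and all their ports are invented stand-ins for whatever licensed IP you're integrating:

```verilog
// Hypothetical top level stitching two purchased IP cores together.
// You never see inside the cores; you just match up their ports.
module top (
    input  wire clk,
    input  wire rx,
    output wire tx
);
    wire [7:0] data;
    wire       valid;

    // uart_rx_core and fifo_core stand in for vendor IP; the port
    // names here are made up for illustration.
    uart_rx_core u_rx   (.clk(clk), .rx(rx), .data(data), .valid(valid));
    fifo_core    u_fifo (.clk(clk), .din(data), .wr_en(valid), .tx(tx));
endmodule
```

This is all the visual "block diagram" tools are really generating under the hood.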
Just writing and simulating some Verilog doesn't really give you an appreciation for hardware, though, particularly as Verilog can be not-particularly-neatly divided into things that can be synthesized and things that can't, which means it's possible to write Verilog that (seems to) simulate just fine but gets optimized away into nothing when you try to put it on an FPGA (usually because you got some reset or clocking condition wrong, in my experience). For this I recommend buying an FPGA board and playing with it. There are several cheap options out there -- I'm a fan of the [Arty]( http://store.digilentinc.com/arty-a7-artix-7-fpga-developmen... ) series from Digilent. These will let you play with non-trivial designs (including small processors), and they've got lots of peripherals, roughly Arduino-style.
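A classic instance of the simulates-fine-but-synthesizes-to-nothing trap (a made-up example, but representative):

```verilog
// This looks reasonable in a simulator: pulse goes high at t=100
// and low at t=200. But '#' delays exist only in simulation --
// synthesis tools ignore them, so there's no real hardware here
// and the logic can be optimized away on an FPGA.
module bad_pulse (output reg pulse);
    initial begin
        pulse = 0;
        #100 pulse = 1;
        #100 pulse = 0;
    end
endmodule
```

Real hardware needs a clock and a counter to measure time, which is exactly the kind of lesson a physical board teaches you quickly.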
If you get that far, you'll have discovered that there's a lot of tooling, and the tooling has a lot of options, and there's a lot that it does during synthesis and implementation that's not at all obvious. Googling around for each of the phases in the log file helps a lot here, but given what your stated interest is, you might be interested in the [VLSI: Logic to Layout]( https://www.coursera.org/learn/vlsi-cad-logic ) course series on Coursera. This talks about all of the logic analysis/optimization those tools are doing, and then in the second course discusses how that translates into laying out actual hardware.
Once you've covered that ground it becomes a lot easier to talk about FPGAs versus ASICs and what does/doesn't apply to each of them (FPGAs are more like EEPROM arrays than gate arrays, and for standard-cell approaches, ASICs look suspiciously like typesetting with gates you'd recognize from an undergrad intro-ECE class and then figuring out how to wire all of the right inputs to all of the right outputs).
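The "FPGAs are more like memories" point can be sketched in a few lines: a LUT is just a small truth table, and programming the FPGA means filling in its contents. This parameterized model is my own illustration, not any vendor's actual primitive:

```verilog
// A 2-input lookup table modeled as a 4-bit memory. The INIT
// parameter plays the role of the bitstream contents; 4'b0110
// happens to encode XOR.
module lut2 #(parameter [3:0] INIT = 4'b0110) (
    input  wire [1:0] sel,
    output wire       out
);
    // The inputs just index into the stored truth table.
    assign out = INIT[sel];
endmodule
```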
Worth noting: getting into ASICs as a hobby is prohibitively expensive. The tooling that most foundries require starts in the tens-of-thousands-per-seat range and goes up from there (although if anyone knows a fab that will accept netlists generated by qflow I'd love to find out about it). An actual prototype ASIC run once you've gotten to packaging, etc. will be in the thousands to tens of thousands at large (>120nm) process sizes.
It is, and electrical engineering students make them pretty regularly; it's just much more expensive and complicated if you actually want to make a chip with the output of one instead of just simulating it.