[ art / civ / cult / cyb / diy / drg / feels / layer / lit / λ / q / r / sci / sec / tech / w / zzz ] archive provided by lainchan.jp

lainchan archive - /λ/ - 6264

File: 1432710145070.png (2.08 MB, 300x233, 1432709920847.gif)


I've been meaning to make this thread for quite a while now.

Let's discuss the programming languages that are further down the abstraction ladder than the heavily abstracted languages popular today.

Most importantly, let's discuss languages and programs written in them that don't require an operating system to function or even computation expressed as in my image.

So what do you like about them, Lain? The assembler languages, Forth, and other such languages are all interesting and welcome discussion points.

What I prize most about all of this is how you gain a real understanding of what modern computation is from it all. So many people program who don't know how these things actually work. You can't be good at anything without an understanding of what that thing actually is. So many people also don't have any variety in what they mess around with.


I was just looking for information on this topic.
I'm just learning about low-level programming and I need some books or a basis to start.
Some lainer help me out.

also what is the programming language in the pic


Looks like Logisim or something similar; it's just a circuit simulation program.

Can't help with the books, m8; the one I read was written by the teacher of my class, and they change a lot depending on the processor.


I forgot to put that in the OP. We should collect resources on this subject as well.

I'd recommend that you read this first:


I've been writing some ARM assembly for an embedded micro lately (LPC 13xx). I find it quite refreshing, actually. The goal of the software is quite simple, so it's not a case of me getting in over my head, and being able to control every instruction does make for a very straightforward workflow.


A great place to start would be the book "The Art of Assembly Language" by Randall Hyde. It's a really solid book and has everything you need to know for getting started with assembly.



It seems like all embedded programming is done in C and asm. I'd like to try embedded programming with something like Fortran. I don't think you need pointers to do systems programming.


Man, I had no clue Logisim could be that powerful. All I've ever done was a basic ALU…


I've heard some bad things about this book—mostly that it focuses way too much on HLA.


Programming from the Ground Up is a free book teaching assembly on GNU/Linux:

> also what is the programming language in the pic

It's a 4 bit microprocessor made by someone on /g/ using LogiSim:


Tried making something in Logisim a while ago with reversible logic stuff, but somehow forgot how to get to the combinatorial analysis window or something and wasn't able to readily turn my lower-level parts into something less memory-intensive. Might get back to that later, but I'd have to either start over or get it off my old computer.

Much more recently, started trying to learn Pure Data, which is made to work much like audio patch systems and thus could probably have its patches implemented in a low-level manner. Reversible logic might work out nicely here too, but what I'd really like to try here is see if I can use it for neural networks.


File: 1438563142471.png (3.1 KB, 200x200, domino03.gif)

How low do you want to go?
Computer made from dominos!

I know it's not a language, but it's nice stuff and I thought you guys might appreciate it!
(OK, so there's no RAM and it can only do a single calculation per setup... but it's very cool)
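The same logic the dominoes compute can be played with in software. A quick Python sketch of a one-bit full adder and a ripple-carry adder built from it (purely illustrative; not tied to any particular domino layout):

```python
# One-bit full adder built from the same AND/OR/XOR gates
# a domino (or relay, or transistor) computer would use.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # XOR chain gives the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry if any two inputs are set
    return s, carry_out

def ripple_add(x, y, width=4):
    """Add two integers by rippling the carry through `width` full adders."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # wraps around at 2**width, like the real hardware would

print(ripple_add(0b0101, 0b0011))  # 5 + 3 = 8
```

With width=4 it even wraps on overflow (15 + 1 = 0), just like a real 4-bit adder with no carry-out wire.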


The individual making-of for the AND/OR/XOR gates:


If we are talking about gamey things, on >>>/rpg/1983 there is a thread about TIS-100. It's a game about programming a strange computer with multiple CPUs in assembly. It's pretty good, even if it's not the real thing.


File: 1439568835864.png (73.46 KB, 200x200, 81bx0GJBVBL-600x600.jpg)

Check out the books from these two guys; they have really great books, especially if you like to program on that border between hardware and software.


People who liked this thread also liked: nand2tetris.org

I enjoy a distributed virtual processing environment with other low level hackers here: skullcode.com
Such anarchy is too chaotic for most, even among machine-level coders, but it's exciting to know that one mistake and my subprocess can be exploited. Frequent snapshots make getting cracked fun instead of painful.

Limiting oneself to existing architectures is not progressive, IMO. For instance, on most processors the instruction pointer is protected -- you can only modify it via opcodes, since directly writing to it could corrupt the execution flow. Skullcode's machine takes this a step further and has a separate call stack for return instruction pointers. Only root-level (ring 0) code can manipulate the call stack directly; user code can only affect it via opcodes (eg: call / ret). Separating the code and data stacks eliminates all stack-smashing exploits. There is nothing a function can do to modify the caller's stack reference frame, since the return operation always restores the parameter base pointer to its prior value (along with the instruction pointer).

On the skullcode machine, stacks grow UP rather than down. Since the stack grows UP, the size allowed for a function is not limited to how much was allocated at call time; there's no need for a special "variadic" distinction, as all functions can take as many parameters as there is remaining system memory. Upon stack overflow the kernel can grow a program's stack and/or relocate it. Since data is accessed by offset from its base "segment" pointer, you don't have to go trace down and modify every "pointer" to grow the stack. This also makes language-level G.C. implementations insanely fast (and largely unnecessary). No more need to implement recursion on the heap. Note: calling heap-based recursion "iteration" is wrong, IMO. One has just implemented a separate stack for parameter data which grows UP instead of down, which should have existed all along...
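The separate-call-stack idea is easy to model. A toy Python sketch (my own illustration; nothing to do with skullcode's actual implementation):

```python
# Toy machine with a data stack and a separate, protected return stack.
# User code can push anything onto the data stack, but CALL/RET are the
# only operations that touch return addresses, so overflowing a buffer
# on the data stack can never redirect control flow.
class ToyMachine:
    def __init__(self):
        self.data_stack = []    # parameters and locals, freely user-writable
        self.return_stack = []  # return addresses; only call/ret touch it
        self.pc = 0

    def call(self, target):
        self.return_stack.append(self.pc + 1)  # save return address safely
        self.pc = target

    def ret(self):
        self.pc = self.return_stack.pop()      # always restores the real caller

m = ToyMachine()
m.call(100)                              # enter a "function" at address 100
m.data_stack.extend([0xDEADBEEF] * 64)   # smash the data stack all we like...
m.ret()
print(m.pc)                              # ...the return address is untouched: 1
```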

In other words: Study the existing chipsets, yes, but just remember that they are not the be all end all to chipset design. E.g. stack smashing exploits needn’t have existed after modern ARM or 64 bit chipsets came out, but the hardware designs embrace old programming techniques. Any OS that implements stacks differently than hardware intends suffers performance penalties due to moronic hardware design decisions.


Would you link me to some reading material concerning skullcode? I would be very grateful.

>Study the existing chipsets, yes, but just remember that they are not the be all end all to chipset design.

I completely agree with this. Modern machines and the systems built on them are based on so much backwards compatibility and mistakes. It's nice to see new and innovative machines.

Every assembler I look at is the same thing, complexity and backwards compatibility. The only solution I've found is to start thinking about what I want in an assembler and to make that a reality.


What about something like RISC-V? It's not backwards compatible with anything, by design. At least not yet.


>You can't be good at anything without an understanding of what that thing actually is.
Hardware is a different thing from software.

I do think hardware is hella cool, I'm just sick of the pretentiousness of hardware fanatics. Hell, why stop at hardware, why not go down to the level of quarks, or lower?


>Let's discuss the programming languages that are further down the abstraction ladder than the heavily abstracted languages popular today.


The thing about RISC-V is that it's RISC.

It's designed to be used by compilers instead of people.

I wrote a post in /tech/ that goes into more detail about my opinions on this:


File: 1440257007923.png (12.46 MB, 200x200, Z80.pdf)

I feel attracted to this bare-metal level of programming, but the whole electronics world is totally alien to me, and I don't yet have the patience to start reading a good book on the subject.
Anyway, I wanted to share this with my fellow lains; it's the book I want to read after I get the ground basics.
I hope you like it.
Build your own Z80 Computer


File: 1440258197568.png (10.32 KB, 200x183, rtab_wireworld.gif)

This is some awesome soykaf. Much appreciated anon.
I'm going to actually try the binary adder, not with dominoes but with WireWorld


>Build your own Z80 Computer
Have you tried reading CODE: The Hidden Language of Computer Hardware and Software? It's basically starting from basic gates and building up to a full-blown computer.


Sounds a bit like "The Elements of Computing Systems".



Anybody writing Go/Rust? Latest Go release has pauseless garbage collection + Rust is a joy to write


File: 1442951696989.png (19.68 KB, 200x200, thanksoface.jpeg)

mfw thats muh gif


I'm taking a class that is introducing VHDL. Does that count?


Wrong thread?
Seriously, this is, like, the opposite of what this thread is about.


File: 1443123876893.png (79.41 KB, 200x175, strobe.png)


Very much so; not only does it teach you about building a processor from scratch, but also how to deal with signals down to the delta cycle.

It really hits the spot of OP's statement:
>What I prize most about all of this is how you gain a real understanding of what modern computation is from it all.

Ultimately it's just a blueprint for an ASIC, can't get any more bare metal than that.

I'll begin: What standard is currently taught in classes? '93 was the one I learned; maybe now they will introduce some newer features. C++0x would be a prime example of such things.

On a related note, I wish everyone would hurry up and catch up with VHDL-2008, because those generic types are rather useful.
Of course that will take at least 5-10 years at this rate; like with every other common language in the field, tried and true is king.


File: 1443125991169.png (2.42 MB, 200x150, sicp-computer-science.gif)

>So many people program who don't know how these things actually work. You can't be good at anything without an understanding of what that thing actually is.

The essence of computing has nothing to do with the physical hardware we use to do our computations. Learning about assembly languages and digital circuits is interesting in its own way, but the above logic is very flawed.


>The essence of computing has nothing to do with the physical hardware we use to do our computations.
True, for the most part.

>Learning about assembly languages and digital circuits is interesting in its own way, but the above logic is very flawed.

It is perfectly sound in a world where abstractions regularly break.

The lowest level accessible in a computer system, the bedrock abstraction, will be what users need to understand, because what is above that will be broken or circumvented for various reasons. As it stands, the overwhelming majority of computer systems don't have a very high-level bedrock abstraction.

It would be nice if the majority of computer systems were only programmable in a more abstracted language, but the market overlords have decided that isn't what we need.


Computer Engineer here, can confirm Petzold's book is an excellent introduction to the field


Sophomore CS student here. I'm reading it because I've heard great things about it, and I'm in love with the style. Very easy to understand without sacrificing much in the way of complexity.


One idea that's been bouncing around my head is to make an OS that is essentially a lisp interpreter. Maybe it could use the linux kernel, maybe not, but I'd want everything to be a) abstracted away from real hardware and b) reliable.

It might also be a cool idea to strip it down even more and make an OS with no kernel, just a lisp interpreter and some barebones hardware virtualization. You'd need a pretty robust lisp dialect (Racket comes to mind), but it would be a cool idea for a research OS.


Are you me?
Maybe we could work together, what do you say?
Pls respond


I started building a mock lisp OS yesterday.
>neurosuggesting I'm not being spied on by kalyx's brigade of cyber warriors so they can pretend to have the same interests as me


sure, if I find the time. Tox?



I'm not him, but he didn't mean a mock lisp. He probably meant Mocklisp, which was the programming language of Gosling Emacs.

Grep this page for 'mocklisp' and you'll be able to read more:




I've been thinking about something like this. Do you, or anybody with knowledge on the topic, think that it would be a sustainable design?


which, the first or second idea? The first would definitely be useful, the second... I dunno.


The first, I think. I guess it would be similar to how Google uses Java for Android, if you were to use the Linux kernel.


The idea of writing a custom kernel for it does however sound quite interesting.


How good is "Programming from the Ground Up"? I've started reading it and it seems quite nice, but I want to know if I'm wasting my time.


Do you already know a fair amount about assembler language?
If you do you may be wasting your time, but it's still a good read.
It's not a long book anyway. You should just finish it. It can't hurt you.


A custom kernel in lisp, no less.


I would say a custom kernel in lisp would make it less accessible than one in C, for example. I think having a kernel built in something usually used for low-level development would make more sense. C allows for easier cross-architecture compilation and has higher-performance compilers than essentially anything else.


This has already been done:

It doesn't feel right to talk about Lisp so much in the low level programming thread, mostly because Lisp already has a thread.

What assembler do you all use? What architectures do you know? I'm getting better with MIPS and GAS, mostly because there's an apparent lack of advanced MIPS assemblers like you would see for other architectures, like FASM. I'm also looking into Gforth's MIPS assembler.

It's interesting to see how a postfix assembler works. It only further helps solidify the difference between the assembler language, the syntax, and the instruction set itself, the semantics.

I'll repost this because it's appropriate for the thread:

opcode: Denotes the type of instruction.
RRRRR, rrrrr: Denotes the registers used by the instruction for sources and the destination.
shamt: The shift amount, used for some instructions.
functi: Denotes the function being used for register instructions, such as addition or a right-shift.
x: Denotes a 16-bit immediate value.
X: Denotes a 26-bit address.

Isn't it beautiful? The regularity both makes it easier to learn and also makes developing MIPS tools much simpler than for a complex architecture such as x86. I suppose that makes it ironic that MIPS tools are so easily outnumbered by x86.
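That regularity means an encoder is almost a one-liner. A Python sketch that packs the R-type fields (field widths from the standard MIPS32 R-type layout; the example registers are the usual $t0/$t1/$t2):

```python
def encode_rtype(opcode, rs, rt, rd, shamt, funct):
    """Pack the six R-type fields into one 32-bit MIPS word:
    opcode(6) | rs(5) | rt(5) | rd(5) | shamt(5) | funct(6)."""
    return (opcode << 26) | (rs << 21) | (rt << 16) | (rd << 11) | (shamt << 6) | funct

# add $t0, $t1, $t2  ->  rd=8 ($t0), rs=9 ($t1), rt=10 ($t2), funct=0x20
word = encode_rtype(0, 9, 10, 8, 0, 0x20)
print(hex(word))  # 0x12a4020
```

Try doing that in five lines for x86's prefix/ModRM/SIB encoding.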


The adjective 'mock' was meant to describe the OS, not the Lisp. My OS is just a REPL that works like a shell. Different users have access to different namespaces and different file permissions. Initial namespace is the 'login' namespace which only exposes #'login and a few other things. That's the plan anyway.


Not fiz-bin?



I usually use whatever comes with my compiler for an assembler. I use Keil's assemblers for 8051 and ARM. They have pretty nice macro support. I use fasm for x86 but I don't write much x86 asm.

I can certainly appreciate the regularity of MIPS, but I actually find more complicated ISAs that pack more instructions into fewer bytes more aesthetically pleasing. Given the importance of instruction cache on the high end and code size on the low end, I see a dense and more complicated ISA as one more way to eke every last bit of performance out of a chip.


File: 1447229977749.png (4.43 MB, 200x150, CP_M Assembler 'Hello World!' - Coleco Adam-QPEegBgi66Q.webm)

Does anyone else try to find information on how old assemblers operated?

It seems valuable to know what assemblers were like when a great many programmers actually used them and it seems like good inspiration if you ever want to design your own, which I do.

Modern assemblers feel strange when you try to do much work with them directly.


Assemblers are *very* simple. Check out some of the assemblers for the DCPU-16; that's probably close to how they used to be.
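To show how simple: here's a toy two-pass assembler in Python for a made-up three-instruction ISA (the opcode table and syntax are invented for illustration; real DCPU-16 assemblers aren't much bigger in spirit):

```python
# Toy two-pass assembler: pass 1 records label addresses,
# pass 2 emits (opcode, operand) words with labels resolved.
OPCODES = {"NOP": 0x00, "LDA": 0x01, "JMP": 0x02}  # made-up opcode table

def assemble(lines):
    labels, words, addr = {}, [], 0
    # Pass 1: note the address where each label will land.
    for line in lines:
        line = line.split(";")[0].strip()  # strip comments and whitespace
        if not line:
            continue
        if line.endswith(":"):
            labels[line[:-1]] = addr
        else:
            addr += 1
    # Pass 2: emit machine words, resolving label operands.
    for line in lines:
        line = line.split(";")[0].strip()
        if not line or line.endswith(":"):
            continue
        parts = line.split()
        op = OPCODES[parts[0]]
        arg = parts[1] if len(parts) > 1 else "0"
        val = labels[arg] if arg in labels else int(arg, 0)
        words.append((op, val))
    return words

prog = ["start:", "LDA 0x10", "JMP start  ; loop forever"]
print(assemble(prog))  # [(1, 16), (2, 0)]
```

The two passes exist only because a label can be used before it's defined; everything else is table lookup.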


What is a comfy ISA to learn assembly with? I don't really want to learn x86, I'm considering ARM or RISC-V, but am more than happy to run anything in a simulator and learn assembler for whichever platform if it means avoiding lots of historical baggage and backwards compatibility etc.

Do you know if MIPS is a lot cleaner than ARM?


>Do you know if MIPS is a lot cleaner than ARM?
From what I've seen, it appears to be.
ARM has two versions of THUMB, which is meant to result in more compact instructions.
As far as I know, MIPS is extremely regular.
When it comes to MIPS assemblers, you're pretty much stuck with GAS though. It makes everything look gross.

MIPS and RISC-V are designed for compiler output over people actually writing in assembler.

If I had to recommend an ISA to you, I would recommend MIPS.



GAS supports Intel syntax; or was your criticism more to do with GAS macro processing?


>intel syntax

I just like the warm feeling of an assembler that is specially tailored to a specific platform. GAS macros are alright.


You definitely need pointers, or some other way to access memory directly, to create an OS.


Are there any decent MIPS64 development boards available?
There's the Creator from ImgTec, but that's 32-bit.

I imagine since MIPS is relatively sane, knowledge from MIPS32 will carry over to 64, but, you know, I'm a fag.


I have also heard this. "High Level" Assembly is an oxymoron. Just my 2 cents but abstracting assembly is silly.


RISC-V isn't too bad to write by hand. It's nicer than x86, which is like kicking dead whales down the beach.


I have yet to find a decent MIPS64 target. ARMv7 (and -M) have a much more complete ecosystem.

Of course, my favorite ARM platform of all time is the Nintendo DS.


I prefer high-level assembly as it's slightly better (easier to read) than repeating chunks of code via macros.

Check out Octo (a CHIP8 HLA) for fun.


File: 1458945722074.png (2.36 MB, 200x150, intelpentium4northwood.jpg)

I want to get into assembly but I don't have experience with any high-level languages. I'm not much interested in them though.

So now I've got a bunch of introductory texts and don't know which to read or where to start. Really, I'm dedicated, and I know that assembly is the language for me, but I'm still overwhelmed. Does anyone have advice?


File: 1458947309225.png (637.25 KB, 200x138, MOS_6501_6502_Ad_Sept_1975.jpg)

I'd suggest starting with a smaller architecture; x86 is overwhelmingly complicated for the beginner.

6502 assembly isn't very hard and will teach you the basics of what you'll need for x86 assembly. Hell, you may even decide you like it and target retro machines (NES, C64, Atari 2600, Apple II, etc.) instead of full-blown x86.

There's even an ebook with a built-in 6502 emulator. https://skilldrick.github.io/easy6502/


My hero! <3 This is perfect, especially because I've recently come across this simulator http://visual6502.org/JSSim/index.html

Infinite neato!!!


Damn, I just realized the tutorial is DIRECTLY LINKED on the simulator.


Agreeing with >>15201, Intel's architectures are far too complicated at this point for practical purposes.

You may find it instructive to build a small program with a pedagogical computer architecture, such as the Little Man Computer: http://elearning.algonquincollege.com/coursemat/dat2343/lectures.f03/12-LMC.htm
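The LMC is small enough that an interpreter for a subset of its standard opcodes fits on one screen. A Python sketch (my own, not from the course material; it covers the common opcode assignments 1xx ADD, 2xx SUB, 3xx STA, 5xx LDA, 6xx BRA, 7xx BRZ, 901 INP, 902 OUT, 0xx HLT):

```python
# Minimal Little Man Computer interpreter over 100 decimal "mailboxes".
def run_lmc(mailboxes, inputs):
    mem = mailboxes + [0] * (100 - len(mailboxes))
    acc, pc, out = 0, 0, []
    inputs = iter(inputs)
    while True:
        instr = mem[pc]
        pc += 1
        op, addr = divmod(instr, 100)       # split 3-digit instruction
        if instr == 901:   acc = next(inputs)          # INP
        elif instr == 902: out.append(acc)             # OUT
        elif op == 0: break                            # HLT
        elif op == 1: acc = (acc + mem[addr]) % 1000   # ADD
        elif op == 2: acc -= mem[addr]                 # SUB
        elif op == 3: mem[addr] = acc                  # STA
        elif op == 5: acc = mem[addr]                  # LDA
        elif op == 6: pc = addr                        # BRA
        elif op == 7: pc = addr if acc == 0 else pc    # BRZ
    return out

# Add two inputs and print the sum: INP, STA 99, INP, ADD 99, OUT, HLT
print(run_lmc([901, 399, 901, 199, 902, 0], [2, 3]))  # [5]
```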

I also find programmable calculators interesting. They're usually comparatively simple, self-contained, and intended to be programmed with machine code through a simple interface.


Interesting. There's a shop in the area that's selling an old TI-30, or it may be a TI SR-40 or TI-57; either way, nobody's touched it for the three or four months it's been hanging around, so maybe I'll pick it up. I know I was attracted to it inexplicably the first time I saw it.


I'm only familiar with HP calculators, so you'll probably want to research the model in your area before buying it. Not all old calculators are programmable.

Depending on the model you buy, you may be able to program in direct machine code easily, or you may have an interface that is slightly higher up, consolidating functions like factorial and square root into single actions while still allowing basic control flow and conditionals, although that's still fairly far down in the lower levels.


Yeah, I've been kind of looking into it, but I forget the model so I'll have to embarrass myself real quick in the store.


Currently reading this, very easy to understand and pretty fun to read. Way better than my professor. Quickly learning more MIPS assembly, it's really fun.


Thanks for the link. I've been planning on learning some assembly over the summer; this looks like a great starting point.


This is one of the biggest hurdles practitioners of theory must overcome: computers are in the real world with physical limitations.

I once had a Haskeller try and lecture me on Claude Shannon, Information Theory, and the Turing Machine when I asserted the fact that we use binary because computers rely on electricity. It's easy to get caught up in the academia of thoughts and theories without realizing the reason those thoughts existed was because of a real world need.

The "essence" of computer indeed has nothing to do with traditional computers in exactly the same way the "essence" of mathematics has nothing to do with the pencil! But that doesn't mean bridges, buildings, and roads are engineering theories thought up and conceived completely separately from real life -- conversely they were created because of our need for them.

The theories are created because of our need and don't exist outside of that. If for some reason the need "to compute" in our society never existed, you can be sure that branch of mathematics wouldn't either.


Anyone know any good resources for writing assemblers and/or disassemblers?


File: 1461268782768.png (27.37 KB, 200x139, zeus_ramdump.png)

This looks cute. Can I write custom blocks in Verilog for it? This might be more visually appealing in presentations than just dull waveforms and cursors.



Right on; displaying and organizing waves for larger testbenches is near impossible to make appealing.
Haven't opened Logisim in years though, but Wikipedia seems to confirm support for both VHDL and Verilog.


>While users can design complete CPU implementations within Logisim, the software is designed primarily for educational use. Professionals typically design such large-scale circuits using a hardware description language such as Verilog or VHDL. Logisim is unable to accommodate analog components.

Strictly analog waveforms aren't an issue anyway I suppose.
Usually I generate a .wav if it needs to be analyzed further.

In total it's fucking great and you can make it look like a motherboard straight from the 80s, the most schway shit.


Maybe the dragon book? Otherwise it depends on the ISA you're trying to write one for


If you know Forth, the gforth distribution includes assemblers and disassemblers for quite a few platforms, including x86, MIPS, power, and others.

Just download the tarball and look in the arch/ directory.

The key aspect of an assembler is keeping an association between the names of opcodes and their values. Good instruction sets usually have a pattern that is very easy to follow, as you can see with MIPS in >>10465.


I want to thank everyone for the useful links and resources.
I have also recently delved into computing. I am set on learning more about what is going on under the hood after being somewhat frustrated with higher-level languages that might be easy and "just work"; between my autism, my desire to know what is actually happening, and my thirst for optimisation, I can't handle them.

I am about halfway through the book 'Assembly Language Step-by-Step' by Jeff Duntemann, and it has been a really useful book so far, with a lot of attention to the very basics of computer operations and how the components of a computer work together.

I also spend a fair bit of time learning to count and calculate in binary and hexadecimal and I find that a basic understanding of that is also really helpful.
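If anyone else is practicing, Python's literal syntax is handy for checking by-hand binary and hex arithmetic:

```python
# Checking by-hand binary and hex arithmetic with Python literals.
x = 0b1011_0110  # 182 in binary
y = 0xA7         # 167 in hex
print(bin(x + y), hex(x + y))  # 0b101011101 0x15d

# Two's complement: the 8-bit pattern for -1 is all ones.
print(format(-1 & 0xFF, "08b"))  # 11111111
```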


File: 1464066502494.png (855.02 KB, 200x147, Vaporwave Kaiba.gif)

Which assembler language does each lainon like the most and why? Honestly, I think MIPS and ARM are pretty nice. In general, I like these architectures more than x86, just due to being more manageable and not being as convoluted and bloated as x86.


z80 was fun


While I don't know many assembly languages, the ARM architecture seems nice. It is clean enough, the registers attempt to be of equal length throughout, and there is actually a defined subset of such registers. And it has a handy scheme for modifying the arguments of an instruction in place.


There's no "assembler language" with the meaning you think about. They are also not (often) tied to one architecture set.

Two examples of assembler languages are NASM for x86 and AMD64, and GAS (GNU assembler) for almost everything (x86, AMD64, ARM, PowerPC, MIPS, and many others). These two examples should be enough, as in reality there are only two common ASM syntaxes: AT&T syntax and Intel syntax.
Here you can find a summary on differences http://asm.sourceforge.net/articles/linasm.html#Syntax

Instruction sets have almost nothing in common with assembly languages' syntax.
There's a reason x86 is "bloated" and, as of now, it's probably still one of the best architectures and instruction sets.
MIPS is shit, as are all RISC ISAs. ARM was meant to be a RISC initially, too, but thankfully they haven't kept going that way with further revisions, and it became a good competitor for x86.
Efficiency of CPUs is limited only by the richness of their instruction set. While it's possible to implement complex operations in software, e.g. division, it's much better to include an instruction for it in the architecture, even if internally the CPU translates it into a sequence of simpler operations (emulates it). This way it's possible to implement such an operation efficiently in future hardware, and all software will benefit from it while still working on both old and new processing units. Now compare that to handling multiple revisions of the same architecture, dealing with the lack of forward compatibility (which is a big deal when a new version of the architecture appears every year or two and major software vendors drop support for old ones quite quickly – hello ARM, Android), and the necessity for multiple assembler implementations.
It's not like x86's ecosystem with multiple extensions isn't affected by that, but it is definitely less severe and not so focused on planned obsolescence.
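The software fallback being described is just a loop of shifts and subtracts. A Python sketch of classic restoring division, one quotient bit per iteration (illustrative; real divide-step instructions differ in detail):

```python
def divide(dividend, divisor, width=32):
    """Restoring division: produce one quotient bit per iteration,
    much like the divide-step instructions some CPUs expose."""
    assert divisor != 0
    quotient, remainder = 0, 0
    for i in range(width - 1, -1, -1):
        # Shift the next dividend bit into the partial remainder.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        if remainder >= divisor:       # divisor fits: subtract, set quotient bit
            remainder -= divisor
            quotient |= 1 << i
    return quotient, remainder

print(divide(1000, 7))  # (142, 6)
```

A hardware divide instruction runs essentially this loop in microcode or dedicated logic, which is why exposing it in the ISA lets future chips speed it up transparently.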


>They are also not (often) tied to one architecture set.

They almost always are, actually. If there is any difference whatsoever between two ISAs, they will have different assembly languages.

>x86 is "bloated" and, as of now, it's probably still one of the best architectures and instruction sets.

IA-64 > AMD64 any day of the week

>MIPS is shit, as are all RISC ISAs.

let the holy wars commence


>They almost always are, actually.
And that's why you completely ignored the following paragraph.
Assembler language syntax is not tied to architecture's instruction set, nor vice versa.
Instruction set is like a library in programming language. The keywords obviously differ, but they don't affect assembler syntax; see gas.
My point was that the original question was incorrectly phrased. OP obviously meant to ask about architectures, instruction sets.

>IA-64 > AMD64 any day of the week

IA-64/Itanium is not IA-32e/EM64T/Intel 64 (Intel's implementation of x86-64). It's also pretty much dead.
I can agree that AMD64 is not the best architecture. Whether it's better than x86 is arguable. It definitely adds some important features (e.g. support for more than 64GB of RAM, which is the limit of x86 with PAE).

>let the holy wars commence

MIPS is great for teaching in academia. Its only advantage is simplicity. The most vocal about its "superiority" are students and people without much knowledge of CISC architectures, it seems.
There is no RISC architecture able to compete with x86 in field of performance. Prove me wrong.
Special-purpose processing units (e.g. GPUs) do not count. While I'm all in for this idea, they are hardly able to serve the purpose of a general-purpose processing unit.


>The keywords obviously differ, but they don't affect assembler syntax; see gas.

Hm, I guess I see your point. It's pretty common to call different instruction sets different 'languages', though.

>It's also pretty much dead.

Yes, this makes me sad.



oh, also:

>There is no RISC architecture able to compete with x86 in field of performance. Prove me wrong.



It's good if you want to learn the basics of x86 assembly. The assembly you write in that book, however, is slower than what an optimizing C compiler would produce, and the x86-64 System V ABI is completely different. Afterwards you should probably read the books at http://www.agner.org/optimize/ to make the assembly you write worthwhile.


Recommended basics for comp eng. and fab?


>tfw no m68k pc to write assembly in


write for the sega genesis



tap tap tap. Just testing. ignore this.


File: 1473621873494.png (5.83 MB, 134x200, CADR The_Lisp_Machine MIT_Museum.jpg)

The first lisp machine (MIT CADR) had its microcode written in Lisp. The source code of the macro-operations can be found here


An example with the MPY and DIV/DIV1/DIV1A/DIV2 instructions:


(REPEAT 30. ((M-2) MULTIPLY-STEP M-2 A-1))


(REPEAT 31. ((M-1) DIVIDE-STEP M-1 A-2))
((A-TEM1) Q-R)


use /test/ for this.


Could you share the source or a binary?


Anybody here have any experience with Verilog? I'm going to learn it for a research project and it seems pretty crazy ... just about as low level as you can get.


Verilog (and VHDL and other hardware description languages) is just a way to describe digital electronic circuits in FPGAs, plus some programming-like stuff.
Do you know digital electronics, anon?


I know AP-physics tier DC circuits and I'll probably be taking EE courses next semester.


I had a class using it, it can be pretty fun!


File: 1477240077180.png (2.33 MB, 200x148, 4004-composite-photo.jpg)

I've done some simple stuff in it, and touch it for work occasionally. As >>19562 says, learning will make more sense with some (simple) context. Not electronics (well, understanding basic stuff like sourcing vs sinking, and tristate, would be good, but that's not directly related), just some digital logic. You could learn the context in a week, really. Do you know what a Karnaugh map is? How about how to construct a flop from NANDs?

Then, keep in mind:
- Verilog is not a programming language.
- Verilog is not low level; it supports design at multiple levels of abstraction.
- At a high level, it can be easy for beginners in VHDL to implicitly introduce sequential elements into what they intend to be combinational. I'm not sure if Verilog is better or worse at that; I just read it for work.


In basic structural Verilog, you're just listing out what gates are in your hardware and wiring them up. Dataflow Verilog lets you assign the output of a wire to a mathematical expression using the outputs of several other wires as values. It looks kind of like C, but the variables must be constants. Procedural Verilog lets you write loops and conditional statements. You use it for writing hardware that uses memory, or unsynthesizable tests.


And this procedural Verilog is what gets total beginners into trouble, from what I remember from college.

What they really want is something like (contrived example in fake syntax; my HDL experience is small and was a while ago):

B := NOT A

What they write:
If (A) B := 0 else B := 1

Which gets them implied flops, and warnings about signals missing from the sensitivity list, which beginners ignore because they have no idea what it means. IIRC, which I may not; my experience is small.
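
In actual Verilog the trap looks something like this (a sketch of the classic mistake, not anyone's real code; signal names are invented):

```verilog
// BAD: en is missing from the sensitivity list and there's no else
// branch, so the tool infers a latch to hold q when en is 0, plus
// warnings about the incomplete sensitivity list.
module bad_not (input wire a, en, output reg q);
    always @(a) begin
        if (en) q = ~a;
    end
endmodule

// GOOD: @(*) covers every input and every path assigns q, so this
// stays purely combinational -- just an enabled inverter.
module good_not (input wire a, en, output reg q);
    always @(*) begin
        if (en) q = ~a;
        else    q = 1'b0;
    end
endmodule
```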


Yes, procedural Verilog is where everything goes to soykaf. Fortunately, anything he'd have to do by hand he can easily do with structural Verilog.


I was reading my verilog guide and they were showing examples of if statements inside an always block, rather than just assigning wires, and I was wondering why they do that... can anyone explain _when_ to use procedural vs structural?


I'll highlight a bit of information in case you missed it. I didn't write it, but I don't have a better answer than this myself.

>You use it for writing hardware that uses memory or unsynthesizeable tests.
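
The rule of thumb I was taught (my summary, not the earlier poster's words): use assign / structural style for pure combinational logic, and a procedural always block once there's state. Something like a counter can't be a continuous assignment at all:

```verilog
// An 8-bit counter: state (memory) forces a clocked always block.
// There is no way to say "remember the old value" with assign.
module counter8 (
    input  wire       clk,
    input  wire       rst,
    output reg  [7:0] count
);
    always @(posedge clk) begin
        if (rst) count <= 8'd0;
        else     count <= count + 8'd1;
    end
endmodule
```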


any lainon doing stockfighter's jailbreak challenge?


You know with an FPGA and an HDL like VHDL or Verilog you can actually design a microprocessor and implement it on reconfigurable silicon.

What I'm saying is that if you go deeper you can actually write processors, I've got one of these, haven't gotten much more than the lights to flash though

pic related is a Basys 2 with a Spartan Field Programmable Gate Array. I don't recommend them though, as using one effectively relies on the Xilinx software suite, which is probably the most bloated piece of proprietary soykaf I've ever seen. I won't go full Stallman on you, but that thing makes Flash look efficient. FPGAs are still really cool though.


File: 1478323486678.png (267.85 KB, 200x113, maxresdefault.jpg)

Forgot the damn picture


File: 1478409952731.png (6.23 MB, 200x150, ice.png)

Yeah FPGAs are great for getting a grip on hardware stuff since they're so malleable. There have been a few chips that have been reverse engineered. The hx1k and hx8k -pic related- even have a complete open source toolchain:

Also there's a forth chip in verilog if you're into that:

An FPGA is basically a dataflow machine, which is cool if you don't want synchronous computation to be the default. It might also interest people who want to get into alternative computation but can't fab a chip -maybe try using Muller C-gates and content-addressable memory:
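
For the curious: a Muller C-gate (C-element) outputs the common value when its inputs agree and holds its previous output when they differ. A behavioral Verilog sketch (the hold is an intentional inferred latch, so real async designs need more care than this):

```verilog
// Muller C-element: q goes high when both inputs are high,
// low when both are low, and holds its value otherwise.
module c_element (input wire a, b, output reg q);
    always @(*) begin
        if (a & b)        q = 1'b1;
        else if (~a & ~b) q = 1'b0;
        // inputs disagree: hold previous q (intentional latch)
    end
endmodule
```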


me: if you want to see what dataflow feels like on your own skin, pick up an fpga and program in verilog for a bit

reader: I think, though, a huge missing piece of my understanding was just how radical are the changes that dataflow causes

me: it isn’t a general-purpose language by any means (the only data type is: the bit) but will give you a sense of the flavour.

reader: fpga and verilog: sounds better than writing spreadsheets at least… thanks for the tip

me: a terrible language, but will give you a feel for what it’s like to write a program where every line ‘executes’ simultaneously.

me: (because it is a circuit rather than a procedure)

me: the proprietary scripting language ‘mathematica’ supports dataflow. but, unsurprisingly, dog-slow.

reader: I always sort of thought that about CPU designs in general. whilst software designers say “how many cycles is this going to take?”, a hardware designer knows how many cycles something is going to take on a chip: one. or else

me: dataflow never caught on because you absolutely need dedicated hardware.
if you want real efficiency, you can’t even use standard RAM. so it gets dismissed as nuttery, on the rare occasions it comes up.

It's pretty cool that you don't have to use those bloated pieces of software for small to medium projects anymore. If you're interested this site is good for the basics -ie going from blinking lights to driving a vga or hdmi port and reading and writing an sd card:


Huh, I've read that at least twice over the last three years, but only now do I understand it.


File: 1482941925592.png (63.57 KB, 200x200, 1469729488190.jpg)

Can you suggest me an old machine that is fun to code and for which emulators are readily available?
I was thinking either the C64 or the Amiga, but I'd like to see more options.
Thanks a lot, love y'all.


C64/Amiga work. Or the Speccy, if you can get one.

Or just roll your own. It's not too hard to build a simplistic computer, so long as you don't bother with a screen (a terminal, or a computer that can function as one, will do if you don't mind the inability to render graphics).


>Or just roll your own. It's not too hard to build a simplistic computer
That's actually a good idea, I might just do that.


Whoa, this is really really interesting.
Never been this amused by logical operators.


I honestly do think the C64 is the best, and also the easiest to set up.

Get cc65 and fuarrrk around with it.


Probably we need to open a new dedicated thread for this:


Some amazing things have been done with the C64 Demoscene, quadraSID music, C64 internet browser, C64 Doom...


Someone even made a UNIX clone for it.


Also apologies for double posting, but are 6502 ASM programs supposed to be so unclean? I don't know how to write this stuff properly.

Should a program that moves a sprite around on a black background really be 58 lines (not counting the sprite byte data I put at the end) ?


Yeah mang it's assembly