My dad worked for HP from the mid-1970s through the mid-1990s. Needless to say, I used HP calculators in high school and college. The best things about having an HP calculator were the solid physical construction (the buttons on the 11C and 15C were awesome), the accuracy, and the fact that whenever your classmates asked to borrow your calculator they would recoil in horror when you asked them whether they knew RPN. Nobody borrowed my calculator. Anyway, I love this project.
My biggest challenge the first time I ever used an HP calculator was less RPN itself than its entry syntax. I thought I had to hit enter after every token, so I typed, e.g., 2⏎3⏎+⏎ rather than 2⏎3+. Needless to say, this did not work as expected, but being simultaneously vain and bashful, I was unwilling to ask for help and did almost all the arithmetic for my freshman physics class by hand.
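For anyone who never made this mistake, here is a toy RPN evaluator in Python (my simplification, not the real HP stack-lift semantics) showing why ENTER is only needed to separate consecutive numbers, while operators fire immediately:

    def rpn(tokens):
        # ENTER's only job is to separate one number entry from the next;
        # operators execute the moment they are pressed, no ENTER needed.
        stack = []
        for t in tokens:
            if t == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            else:
                stack.append(float(t))
        return stack

    print(rpn(["2", "3", "+"]))  # [5.0] -- the only ENTER needed is between 2 and 3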
> the buttons on the 11C and 15C were awesome
What is the trick to engineering HP calculator keys? Nobody gets keys right like the old HP calculators.
In this age of 3D printing and fast prototypes, we really ought to be able to crack this.
I still have my dad's old HP with the glowing red letters and all the functions. Not sure if we still have the charger, and not sure the battery is any good, but the calculator worked fine the last time it was turned on, decades ago. Any idea if this can be made to function again?
If the CPU is nibble-oriented, wouldn't that mean that that is its byte size?
Are you trying to make a pun with byte/bite relating to nibble? Because that's actually where the term nibble (referring to 4 bits) comes from, so I'm not sure such a pun even counts as a pun anymore. Or am I misinterpreting your comment?
When did we stop spelling it "nybble"?
A byte is always 8 bits. The word you're looking for is `word-size`, which in this case would be 4 bits.
A byte is not always 8 bits on old machines, though it is standardised as 8 bits nowadays.
This is why network RFCs talk of "octets", to avoid the ambiguity. Octets are always 8 bits.
https://en.wikipedia.org/wiki/Octet_(computing)
I didn't realize there was a name for 16 bits: 'chomp', haha. More formally, a hextet.
The definition of a byte today is different from the definition when those machines were manufactured, just like how 'foot' is now standardized.(*)
(* Technically, a 'foot' is not a standard unit of measure, but that's due to the long history of 'foot' not being standardized until relatively recently.)
The core question: how did HP's scientific calculators actually work at the gate level? That rabbit hole led to building one from scratch.
The architectural decision everything else follows from: a decimal calculator should store numbers as BCD, one decimal digit per 4-bit nibble. A standard byte-oriented CPU (Z80, 6502) fights that layout constantly. So I designed a small custom CPU in Verilog where 4 bits is the natural data width and memory is nibble-addressable.
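To make the layout concrete, here is a minimal Python sketch of BCD stored one digit per nibble, with the digit-serial add a nibble-wide ALU does naturally (the function names are illustrative, not from the project):

    def to_bcd(n, width):
        # Pack a non-negative integer into `width` nibbles, one decimal
        # digit per nibble, least significant digit first.
        digits = []
        for _ in range(width):
            digits.append(n % 10)   # each digit 0..9 fits in 4 bits
            n //= 10
        return digits

    def bcd_add(a, b):
        # Digit-serial BCD add: one nibble per step, decimal-adjusting
        # any sum >= 10 and carrying into the next digit.
        out, carry = [], 0
        for da, db in zip(a, b):
            s = da + db + carry
            carry, s = (1, s - 10) if s >= 10 else (0, s)
            out.append(s)
        return out

    print(bcd_add(to_bcd(1234, 8), to_bcd(987, 8)))
    # [1, 2, 2, 2, 0, 0, 0, 0] -> 2221, least significant digit first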
What the project covers:
- Custom CPU: Harvard architecture, 12-bit ISA, 8-state execution FSM, hardware stack guard with a FAULT state for microcode debugging
- CORDIC for trig functions, verified to 14 significant digits (see the sketch after this list)
- Two-pass assembler in Python (~700 lines)
- Verilator + Qt framework: same Verilog source runs in simulation, as a desktop GUI debugger, as WebAssembly, and on real hardware
- Scripting language on top of the microcode for adding functions without touching hardware
- Custom PCB (EasyEDA/JLCPCB), battery, charging circuit
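A hedged sketch of CORDIC in rotation mode, the algorithm the CORDIC bullet names. This floating-point Python version is only for illustration; the project's version presumably runs in fixed-point BCD microcode, and nothing below is taken from it:

    import math

    def cordic_sin_cos(theta, iters=50):
        # Precompute the arctangent table and the cumulative gain for
        # `iters` micro-rotations; a real implementation stores these.
        angles = [math.atan(2.0 ** -i) for i in range(iters)]
        k = 1.0
        for i in range(iters):
            k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
        # Rotate (1, 0) toward theta using only shifts and adds per step.
        # Converges for |theta| <= ~1.74 rad; range-reduce larger angles.
        x, y, z = 1.0, 0.0, theta
        for i in range(iters):
            d = 1.0 if z >= 0 else -1.0   # steer toward the residual angle
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * angles[i]
        return x * k, y * k               # (cos(theta), sin(theta))

    print(cordic_sin_cos(math.pi / 6))    # ~ (0.8660254..., 0.5000000...)

With ~50 iterations the residual angle is on the order of 2^-50, roughly in line with the 14-significant-digit verification claimed above.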
Write-up: https://baltazarstudios.com
Hackaday: https://hackaday.com/2026/05/13/build-the-cpu-then-build-the...
At least the 6502 has a BCD mode built in!
Ironically, the Z80 has a nibble-wide ALU. That's why it's so slow compared to the competition: an 8-bit add on a "2 MHz" Z80 takes about as much wall-clock time as an 8-bit add on a "1 MHz" 6809.
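To illustrate the point (a Python model of the idea, not Z80 internals verbatim): an 8-bit add composed from two passes through a 4-bit adder, where the carry out of the low pass is what the Z80 exposes as its half-carry (H) flag:

    def alu4(a, b, cin):
        # One pass through a 4-bit adder: (4-bit sum, carry out).
        s = a + b + cin
        return s & 0xF, s >> 4

    def add8_via_nibbles(a, b):
        # Low nibbles first; the carry out is the half-carry (H) flag,
        # which then feeds the second pass over the high nibbles.
        lo, half = alu4(a & 0xF, b & 0xF, 0)
        hi, carry = alu4(a >> 4, b >> 4, half)
        return (hi << 4) | lo, carry, half

    s, c, h = add8_via_nibbles(0x3A, 0x2C)
    print(hex(s), c, h)  # 0x66 0 1 -- 0x3A + 0x2C = 0x66 with H set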