Tensorware: The Third Medium of Logic
1. Hardware
We build logic by wiring together physical primitives—transistors, gates, flip-flops. Inputs are voltages; outputs are voltages. The logic is baked into the structure itself.
2. Software
We express logic through instructions that manipulate memory. A function takes inputs, applies explicitly coded rules, and produces outputs. The logic is baked into the instructions themselves.
Both are mediums for expressing and executing logic. But today, a third medium is quietly emerging.
Enter Tensorware
Tensorware is the idea that logic can be implemented as a tensor transformation—a learned or intentionally constructed arrangement of numerical operations inside a machine learning model.
In other words: Tensorware is logic encoded not in circuits or instructions, but in the shape and parameters of a tensor function. It sits alongside hardware and software as a new substrate for computation.
How Tensorware Works
Instead of writing if statements or wiring gates, you design a model architecture whose forward pass becomes the logic. For simple transformations, you can directly set the parameters. For more complex ones, you train them—letting data sculpt the logic into the network.
Inputs become vectors, matrices, or higher-order tensors. Outputs do the same. The logic emerges through tensor multiplication, activation functions, and learned structure.
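To make that concrete, here is a minimal sketch of the "directly set the parameters" case, assuming NumPy as the tensor library. A hand-set weight matrix, a bias, and a step activation are enough to turn a logical rule (OR, in this case) into a pure tensor transformation:

```python
import numpy as np

def step(x):
    # Heaviside step activation: 1 where x > 0, else 0
    return (x > 0).astype(int)

# Hand-set parameters implementing OR: the neuron fires if either input is 1
W = np.array([[1.0], [1.0]])  # weights, shape (2, 1)
b = np.array([-0.5])          # bias acts as the firing threshold

def forward(inputs):
    # The forward pass *is* the logic: multiply, add bias, activate
    return step(inputs @ W + b)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(forward(inputs).ravel())  # [0 1 1 1], the OR truth table
```

No training was involved: the logic lives entirely in the chosen shape of `W` and `b`.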
A Useful Analogy
| Medium | How Logic Is Implemented | How Logic Is Changed |
|---|---|---|
| Hardware | Physical arrangement of gates | Fabrication |
| Software | Human-written instructions | Editing code |
| Tensorware | Tensor transformations (learned or set) | Training or directly setting parameters |
Tensorware is not replacing hardware or software. But it represents a new category: one where logic becomes mathematical structure shaped by data, rather than an engineered sequence of steps.
Closing Thought
We’re used to thinking in two dimensions: hardware and software. But machine learning has introduced a third medium, one that doesn’t follow our instructions, but instead learns the logic we intend. Tensorware is the conceptual bridge that helps us talk about this shift. It’s not just models. It’s not just weights. It’s a new way to build systems.
What does that really mean?
Implementing AND Logic
To implement AND in hardware, you take a circuit board and start wiring components together.
To implement it in software, you open up your favorite IDE and start typing:

```python
def and_gate(a: bool, b: bool) -> bool:
    # `and` is a reserved keyword in Python, so the function needs its own name
    return a and b
```
In Tensorware, you start designing a neural network. How many layers do you need? Which activation functions? What type of layers (dense, convolutional, recurrent, ...)? What is your training data, how are you going to train, and what is your objective function?
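As a sketch of the trained route (again assuming NumPy; the seed, learning rate, and iteration count are arbitrary choices, not prescriptions), the four rows of the AND truth table serve as training data, binary cross-entropy as the objective, and plain gradient descent as the training procedure. A single sigmoid neuron suffices, since AND is linearly separable:

```python
import numpy as np

# Training data: the full AND truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
W = rng.normal(size=2)  # two weights, learned rather than hand-set
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Objective: binary cross-entropy, minimized with plain gradient descent
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ W + b)           # forward pass over all four examples
    grad = p - y                     # gradient of the loss w.r.t. the pre-activation
    W -= lr * (X.T @ grad) / len(y)  # update weights
    b -= lr * grad.mean()            # update bias

print((sigmoid(X @ W + b) > 0.5).astype(int))  # expected: [0 0 0 1], AND learned
```

Here the logic is not written down anywhere; it is sculpted into `W` and `b` by the data and the objective function.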
The Strategic Implication: A Shift Away From Programming
We’ve seen this pattern before in technology. Early computer graphics required engineers to program every detail: rendering pipelines, lighting equations, animation curves. Producing CGI was essentially a software engineering task. But with the arrival of 3D Studio Max, Maya, and later modern game engines, the medium changed. Creation didn’t rely on low-level programming anymore; it relied on using higher-level tools that encapsulated the complexity.
Tensorware is on a similar path. Today, implementing logic through tensors still feels like a specialist activity involving model definitions, training loops, and careful parameter tuning. But the long-term direction is clear: organizations will increasingly leverage tensor-based logic through platforms and tooling rather than programming. The complexity will be abstracted, not by replacing people, but by raising the level at which they operate.
Just as graphics moved from code to creation, logic will increasingly move from software to Tensorware.

