Translating Business Problems Into Quantum Programs
How to go from a business problem to a quantum circuit: problem formulation, encoding choices, Hamiltonian mapping, and the classical-quantum boundary.
Translating a business problem into a quantum computation is the hardest step and the one where most projects fail. The process requires reformulating the problem mathematically, choosing how to encode variables into qubits, mapping the problem onto a Hamiltonian or circuit, and deciding which parts stay classical. Each decision involves tradeoffs that affect whether the quantum approach can outperform classical methods.
Where Projects Actually Fail
A materials science team at a national research lab spent three months building a quantum circuit for a lattice simulation. The circuit worked. It ran on hardware. It produced results. The results were useless, because the encoding they’d chosen required 4x more qubits than the problem actually needed. A different encoding, discovered by a postdoc reviewing the literature two weeks later, cut the qubit requirement in half and the circuit depth by 60%.
They hadn’t failed at quantum computing. They’d failed at translation.
The step from “we have a business problem” to “we have a quantum circuit that solves it” is not a single step. It’s a journey across four distinct stages, each with its own decisions, tradeoffs, and failure modes. Most quantum projects that produce disappointing results fail at one of these stages, not at the quantum hardware.
Stage 1: Mathematical Formulation
Every quantum computation begins with a precise mathematical statement of the problem. Not “optimize our logistics,” but “minimize the total weighted distance across N delivery routes subject to capacity constraints C1 through Cm, where the distance matrix D is given and the decision variables x_ij are binary.”
This sounds obvious, but the formulation determines everything downstream. Different mathematical formulations of the same business problem lead to radically different quantum circuits with different qubit requirements, different circuit depths, and different chances of success.
Consider portfolio optimization. The standard Markowitz formulation minimizes portfolio variance subject to a return constraint. This is a quadratic optimization problem. It maps naturally to a QUBO (Quadratic Unconstrained Binary Optimization) formulation, which maps naturally to quantum hardware.
But if your actual business problem is “select 20 stocks from a universe of 500 that maximize risk-adjusted return subject to sector exposure limits, liquidity constraints, and transaction cost bounds,” you have a mixed-integer nonlinear program. Getting from here to a QUBO requires a series of approximations: discretizing continuous variables, relaxing some constraints into penalty terms, and choosing how finely to represent each variable. Each approximation introduces error that has nothing to do with quantum hardware.
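To make the approximation machinery concrete, here is a minimal sketch (plain Python, with illustrative numbers and function names, not a library API) of how a constraint like “select exactly k items” is relaxed into a quadratic penalty, yielding a QUBO:

```python
# Sketch: turning "pick exactly k of n items, maximizing total value" into a
# QUBO by relaxing the cardinality constraint into a quadratic penalty.
# All names and the penalty weight are illustrative.

def build_qubo(values, k, penalty):
    """QUBO coefficients Q (dict of (i, j) -> coeff) for:
       minimize  -sum(v_i * x_i) + penalty * (sum(x_i) - k)^2
    Expanding the penalty and using x_i^2 = x_i for binary variables gives
    linear terms penalty*(1 - 2k)*x_i and pair terms 2*penalty*x_i*x_j
    (the constant penalty*k^2 is dropped)."""
    n = len(values)
    Q = {}
    for i in range(n):
        Q[(i, i)] = -values[i] + penalty * (1 - 2 * k)
        for j in range(i + 1, n):
            Q[(i, j)] = 2 * penalty
    return Q

def qubo_energy(Q, x):
    """Evaluate the QUBO objective for a binary assignment x."""
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())
```

The penalty weight is itself a modeling decision: too small and the minimizer returns infeasible selections; too large and the penalty swamps the objective. Either way, the error introduced has nothing to do with quantum hardware.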
The formulation question is: does the mathematical structure of your problem, after all necessary approximations, retain enough of the original problem’s character to produce useful answers?
A good quantum computing team spends at least as much time on formulation as on circuit design. A bad one opens a quantum SDK on day one.
Stage 2: Encoding Variables Into Qubits
Once you have a mathematical formulation, you must represent its variables as qubits. This is the encoding step, and it’s where qubit counts either stay manageable or explode.
Binary Encoding
The simplest: one qubit per binary decision variable. For a problem with N binary variables, you need N qubits. Max-cut on a 50-node graph needs 50 qubits. Selecting items from a catalog of 200 needs 200 qubits.
This works for naturally binary problems. It breaks down when variables are continuous or integer-valued. Representing an integer that ranges from 0 to 255 requires 8 qubits in binary encoding (2^8 = 256 values). A continuous variable accurate to 0.01 over the range [0, 100] requires ceil(log2(10000)) = 14 qubits. Ten such variables: 140 qubits, just for the encoding, before you add ancilla qubits for computation.
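The qubit-count arithmetic above generalizes into a small helper; a sketch in plain Python (function names are illustrative):

```python
# Sketch: qubit cost of representing a continuous variable on [lo, hi]
# to a given precision with plain binary (fixed-point) encoding.
import math

def qubits_for_range(lo, hi, precision):
    """Qubits needed to cover [lo, hi] at the given precision."""
    levels = math.ceil((hi - lo) / precision) + 1  # distinct values to cover
    return math.ceil(math.log2(levels))

def decode(bits, lo, precision):
    """Map a bitstring (most significant bit first) back to a value."""
    index = int("".join(map(str, bits)), 2)
    return lo + index * precision
```

qubits_for_range(0, 100, 0.01) returns 14, matching the count above, and ten such variables indeed cost 140 qubits before any ancillas.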
One-Hot Encoding
An alternative: represent an integer variable that takes K values using K qubits, exactly one of which is in state |1>. This uses more qubits but produces simpler interactions between variables (often only 2-local, meaning each term involves at most two qubits).
Simpler interactions mean shallower circuits. Shallower circuits mean fewer errors. But more qubits mean more hardware. The tradeoff is real and problem-specific.
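A sketch of the penalty that enforces the one-hot constraint, expanded into its 2-local terms (plain Python, illustrative names):

```python
# Sketch: one-hot encoding of an integer variable with K values.
# The "exactly one qubit is |1>" constraint becomes a quadratic penalty
# whose terms involve at most two qubits (2-local), at the cost of K qubits.

def one_hot_penalty_terms(K, weight):
    """Terms of weight * (sum_i x_i - 1)^2, dropping the constant:
       linear terms:  -weight * x_i          (from x_i^2 = x_i and -2*x_i)
       pair terms:    2 * weight * x_i * x_j for i < j."""
    linear = {i: -weight for i in range(K)}
    pairs = {(i, j): 2 * weight for i in range(K) for j in range(i + 1, K)}
    return linear, pairs

def penalty(x, linear, pairs):
    """Evaluate the penalty for a bit assignment x; valid one-hot states
    sit at the minimum."""
    return (sum(linear[i] * x[i] for i in linear)
            + sum(c * x[i] * x[j] for (i, j), c in pairs.items()))
```

Every valid one-hot state lands at the same minimum penalty; states with zero or two or more qubits set are strictly worse, which is what steers the optimizer back to valid encodings.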
Compact and Problem-Specific Encodings
Researchers continuously discover encodings tailored to specific problem structures. Fermion-to-qubit mappings (Jordan-Wigner, Bravyi-Kitaev, and newer approaches) convert chemical simulation problems into qubit operators with different tradeoffs between qubit count and operator complexity. The Jordan-Wigner transformation preserves locality in one dimension but creates long-range interactions. Bravyi-Kitaev balances locality and operator weight. Newer encodings like the compact encoding by Derby and Klassen further reduce qubit overhead for specific molecular geometries.
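As a minimal illustration of what a fermion-to-qubit mapping produces, here is the Jordan-Wigner image of a single annihilation operator, with Pauli strings written as plain Python strings; a production workflow would use a library such as OpenFermion rather than hand-rolled strings:

```python
# Sketch of the Jordan-Wigner transformation: the annihilation operator for
# fermionic mode j on n modes becomes a pair of Pauli strings,
#   a_j = Z_0 ... Z_{j-1} (X_j + i Y_j) / 2,
# where the Z prefix carries the fermionic sign (antisymmetry).

def jw_annihilation(j, n):
    """Return [(coefficient, pauli_string), ...] for a_j under Jordan-Wigner."""
    z_string = "Z" * j           # parity string on modes below j
    tail = "I" * (n - j - 1)     # identity on modes above j
    return [(0.5, z_string + "X" + tail),
            (0.5j, z_string + "Y" + tail)]
```

jw_annihilation(2, 4) yields ZZXI and ZZYI (with coefficients 1/2 and i/2): the Z prefix that grows with the mode index is exactly the long-range interaction cost noted above.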
The choice matters enormously. For a molecule like FeMoco (the active site of nitrogenase, a prime target for quantum chemistry), different encoding choices lead to estimates ranging from 200 to over 4,000 logical qubits. A 20x range, same molecule, same accuracy target, different encoding.
The numbers that decide feasibility:
- 20x encoding range: FeMoco estimates span 200 to 4,000+ logical qubits depending on encoding.
- 14 qubits for one variable: a continuous variable at 0.01 precision over [0, 100].
- 60% depth reduction: from a better encoding choice alone.
Stage 3: Mapping to a Hamiltonian or Circuit
With variables encoded in qubits, the problem must be expressed as quantum operations.
The Hamiltonian Approach
For optimization and simulation problems, the standard approach is constructing a Hamiltonian, a mathematical operator whose lowest-energy state corresponds to the problem’s optimal solution.
For optimization: a QUBO problem with binary variables translates into an Ising Hamiltonian, a sum of terms involving single-qubit Z operators and two-qubit ZZ operators. The coefficients come directly from the objective function and penalty terms for constraints. This Ising Hamiltonian is then used as the cost function in QAOA or as the target for quantum annealing.
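The QUBO-to-Ising step is mechanical: substitute x_i = (1 - z_i)/2 with spins z_i in {+1, -1} and collect coefficients. A dict-based sketch (illustrative, not a library API):

```python
# Sketch: converting a QUBO over binary x in {0, 1} into an Ising Hamiltonian
# over spins z in {+1, -1} via x_i = (1 - z_i) / 2.
# Q maps (i, j) with i <= j to a coefficient.

def qubo_to_ising(Q):
    """Return (h, J, offset) with energy
       sum_i h[i] * z_i + sum_{i<j} J[(i, j)] * z_i * z_j + offset."""
    h, J, offset = {}, {}, 0.0
    for (i, j), q in Q.items():
        if i == j:                       # linear term: q * (1 - z_i) / 2
            h[i] = h.get(i, 0.0) - q / 2
            offset += q / 2
        else:                            # quadratic: q * (1 - z_i)(1 - z_j) / 4
            J[(i, j)] = J.get((i, j), 0.0) + q / 4
            h[i] = h.get(i, 0.0) - q / 4
            h[j] = h.get(j, 0.0) - q / 4
            offset += q / 4
    return h, J, offset
```

The single-qubit Z coefficients are h, the ZZ couplings are J, and the constant offset is carried classically; the same spectrum, shifted, is what QAOA or an annealer then minimizes.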
For chemistry: the electronic structure Hamiltonian describes the interactions of electrons in a molecule. After applying a fermion-to-qubit mapping, you get a sum of Pauli operators (products of X, Y, Z matrices) acting on the qubits. The number of terms in this sum grows roughly as the fourth power of the number of orbitals, so it can be large: a modest molecule might produce a Hamiltonian with thousands of terms.
Each term in the Hamiltonian must be measured separately (or grouped into commuting sets that can be measured simultaneously). More terms mean more measurements mean more time on the quantum processor. Techniques for grouping commuting terms and reducing measurement overhead are an active research area that directly impacts practical runtimes.
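The simplest of these grouping criteria, qubitwise commutativity, fits in a few lines; the greedy first-fit strategy below is a common baseline, not the state of the art:

```python
# Sketch: greedy grouping of Pauli strings into qubitwise-commuting sets.
# Two strings qubitwise commute when, at every qubit position, the letters
# are equal or at least one of them is the identity I.

def qubitwise_commute(p, q):
    return all(a == b or a == "I" or b == "I" for a, b in zip(p, q))

def group_paulis(paulis):
    """First-fit greedy grouping: put each string in the first group where
    it commutes with every member, else start a new group."""
    groups = []
    for p in paulis:
        for g in groups:
            if all(qubitwise_commute(p, q) for q in g):
                g.append(p)
                break
        else:
            groups.append([p])
    return groups
```

Everything in one group can be estimated from the same shot data, so the number of groups, not the number of terms, sets the measurement budget.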
The Circuit Approach
Some problems map more naturally to quantum circuits than to Hamiltonians. Cryptographic attacks (Shor’s algorithm) use specific circuit structures designed around the mathematical properties of modular arithmetic. Quantum walks for graph problems construct circuits that propagate quantum amplitudes through the graph structure.
Designing these circuits is algorithmic work, closer to algorithm design than to the “formulate, encode, map” pipeline used for optimization and simulation. It requires deep quantum computing expertise and typically targets well-defined mathematical problems rather than business applications.
Stage 4: The Classical-Quantum Boundary
Here is where the architect’s judgment matters most. In every practical quantum computation today, the quantum processor handles a fraction of the total work. The rest (data preprocessing, parameter optimization, post-processing, result validation) runs on classical hardware.
Deciding what goes on the quantum processor and what stays classical is a design decision with major consequences.
Too much on the quantum side: Circuit depth grows, errors accumulate, results degrade. A team at a European bank described running a full portfolio optimization on a quantum processor: 200+ qubits, deep circuit, results worse than random. They restructured the computation to use the quantum processor only for estimating a specific correlation structure, feeding that estimate into a classical optimizer. The hybrid approach, while not yet beating fully classical methods, produced meaningful results.
Too little on the quantum side: If the quantum processor is only evaluating a trivial subroutine, the quantum overhead (in time, cost, and complexity) exceeds any potential advantage. You’ve built a quantum-classical system that’s slower than a purely classical one.
The pattern that works in practice (for variational algorithms): the classical optimizer handles the high-level search, the quantum processor evaluates a cost function that is specifically hard for classical computers, and the classical post-processor aggregates results from multiple quantum runs.
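The loop structure (not any particular algorithm) looks like this; the quantum evaluation is a classical stand-in here, since the point is the division of labor:

```python
# Sketch of the hybrid variational loop: a classical optimizer proposes
# parameters, the quantum processor evaluates the cost, and the classical
# side accepts or rejects. evaluate_on_qpu is a placeholder; in practice
# it would compile a circuit, run shots, and average the measurements.
import random

def evaluate_on_qpu(params):
    # Stand-in cost function with a known minimum at p = 0.3 on each axis.
    return sum((p - 0.3) ** 2 for p in params)

def hybrid_optimize(n_params, iterations=200, step=0.1, seed=7):
    rng = random.Random(seed)
    params = [rng.uniform(-1, 1) for _ in range(n_params)]
    cost = evaluate_on_qpu(params)
    for _ in range(iterations):
        trial = [p + rng.gauss(0, step) for p in params]  # classical proposal
        trial_cost = evaluate_on_qpu(trial)               # quantum evaluation
        if trial_cost < cost:                             # classical accept/reject
            params, cost = trial, trial_cost
    return params, cost
```

Everything except evaluate_on_qpu stays on classical hardware; the quantum processor only answers the one question that is expensive classically, which is the boundary placement the paragraph above describes.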
For chemistry problems, the natural split is: classical methods handle the molecular geometry and orbital selection (identifying which electrons and orbitals are most important), the quantum processor computes the electronic energy of the active space (the classically hard part), and classical methods embed this result back into the full molecular context.
For optimization problems, the split is less clear and often less favorable to the quantum side, because classical optimization heuristics are remarkably good at the scales quantum hardware can currently handle.
A Worked Example: Molecular Binding Energy
To make this concrete, consider computing the binding energy of a small drug-like molecule interacting with a protein pocket. This is a real pharmaceutical problem with genuine commercial value.
Business problem: Will this candidate molecule bind strongly to the target protein? Estimating binding energy accurately (within 1 kcal/mol) is worth billions in pharmaceutical R&D because it predicts drug efficacy before expensive synthesis and testing.
Mathematical formulation: Compute the ground-state electronic energy of the molecule-protein complex and subtract the energies of the separated molecule and protein. The electronic energy comes from solving the Schrödinger equation for the electrons in the combined system.
Encoding decision: The full system has thousands of electrons, far too many for any quantum computer. A classical active-space calculation identifies the 20-40 most important spin orbitals (those involved in the binding interaction). These are encoded using a fermion-to-qubit mapping. Under the Jordan-Wigner transformation, 40 spin orbitals become 40 qubits. Under Bravyi-Kitaev, still 40 qubits, but with different operator structures.
Hamiltonian construction: The electronic Hamiltonian for 40 orbitals produces roughly 100,000-500,000 Pauli terms. After grouping commuting terms, perhaps 10,000-50,000 measurement groups. Each group requires hundreds to thousands of measurement shots for statistical convergence. Total quantum processor time: hours to days on current hardware, potentially minutes with error mitigation and efficient measurement schemes.
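The shot-budget arithmetic behind that estimate is worth making explicit; all numbers below are hypothetical placeholders, since real rates depend on the hardware and the grouping method:

```python
# Back-of-envelope sketch of the measurement budget described above.
# Inputs are illustrative, not measured values.

def measurement_budget(n_groups, shots_per_group, shots_per_second):
    """Return (total shots, wall-clock hours) for one energy estimate."""
    total_shots = n_groups * shots_per_group
    hours = total_shots / shots_per_second / 3600
    return total_shots, hours
```

With 20,000 groups at 1,000 shots each and a hypothetical 2,000 shots per second, the budget is 20 million shots and a few hours of processor time, consistent with the range above; halving the group count through better grouping halves the runtime directly.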
Classical-quantum boundary: The classical side handles orbital selection, basis set construction, measurement grouping, statistical aggregation, and embedding the active-space result into the full molecular context. The quantum side evaluates the ground-state energy of the active space. This is the classically hard part: exact classical methods for 40 orbitals scale as 2^40 (about a trillion), which is on the boundary of classical feasibility but would take days on a supercomputer. A quantum computer with sufficient quality qubits could do this in polynomial time.
Current gap: Running this on today’s NISQ hardware with 40 noisy qubits and no error correction produces results dominated by noise. The chemical accuracy target of 1 kcal/mol requires circuit depths that exceed current coherence limits. This problem is a strong candidate for early fault-tolerant machines with 50-100 error-corrected logical qubits.
- Classical: exact methods for 40 orbitals scale as 2^40 (~1 trillion); days on a supercomputer; feasible but slow.
- Quantum: polynomial time with sufficient-quality qubits; needs 50-100 error-corrected logical qubits; a strong candidate for early fault-tolerant machines.
The Formulation Skill Gap
Here’s something the quantum industry doesn’t talk about enough: the scarcest resource in quantum computing is not qubits. It’s people who can translate real-world problems into good quantum formulations.
This translation requires three kinds of expertise simultaneously: domain knowledge (understanding the actual business problem and what constitutes a useful answer), mathematical maturity (reformulating the problem into a structure that maps efficiently to quantum operations), and quantum computing knowledge (understanding which encodings, circuit structures, and algorithms work best for which problem types on which hardware).
Finding one person with all three is rare. Building a team that covers all three is expensive but feasible. Attempting the translation without all three covered is how projects produce circuits that are technically correct but practically useless.
The materials science team that found a better encoding two weeks after completing their circuit wasn’t negligent. They were working at a research frontier where the optimal approach isn’t yet known for most practical problems. Translation from business problem to quantum circuit is an active area of research, not a solved problem with settled best practices.
For technical leaders evaluating quantum projects: the translation stage is where you should apply the most scrutiny. Ask what alternative formulations were considered. Ask why this encoding was chosen over others. Ask where the classical-quantum boundary was drawn and what happens if you move it. The answers will tell you more about the project’s quality than any benchmark number.
Key Takeaways
- Translation from business problem to quantum circuit has four stages: mathematical formulation, qubit encoding, Hamiltonian/circuit mapping, and classical-quantum boundary design.
- Encoding choices alone can create a 20x difference in qubit requirements for the same problem.
- The scarcest resource is people who combine domain knowledge, mathematical maturity, and quantum computing expertise.
- Scrutinize the translation more than the hardware. Ask what alternatives were considered at each stage.