A brief history of the mortgage factory problem

February 10, 2026
Crayton Montei
The mortgage factory problem: how we got here

For nearly a century, mortgage origination has taken place via the mortgage factory model: Applications move from intake to processing to underwriting to closing, passing through specialized departments and dedicated teams at each stage. This model has achieved impressive scale, originating trillions of dollars in loans, financing hundreds of millions of homes, and enabling mortgages to become an anchor of the American financial system. 

Yet the mortgage factory model has increasingly fallen out of step with today’s economic and technological realities. While adjacent industries have leveraged technology to drastically lower costs, the cost of mortgage origination has steadily climbed, nearly doubling in the past decade alone. This trend has persisted despite decades of attempts at automation and billions of dollars in technology investment, leaving originators with little more than hope that an answer may be found in the next point solution or incremental automation.

If we look to history, we can see that the mortgage factory model embodies a set of outdated assumptions about the manual, chronological, and fragmented nature of key mortgage workflows. While necessary in the 1940s, these assumptions are at odds with the kinds of automation that have enabled other industries to drive down costs. By bolting new technologies onto the mortgage factory’s fundamentally human-centered model, the industry has unwittingly doubled down on the very obstacle to modernization.

The mortgage factory wasn’t a mistake; it was the right solution for its time, designed for an era when standardization was revolutionary and human judgment was non-negotiable. But 80 years later, things have changed. This is the story of how the mortgage factory originally came to be, where it has managed to evolve, and why it has finally reached its breaking point.

The assembly line is born (1930s-1940s)

Prior to the Great Depression, the American mortgage system was fragmented and precarious. Mortgages typically required 50% down payments, carried terms of just 3 to 5 years, and ended with large balloon payments that forced repeated refinancing. When the Depression hit and property values collapsed, the system imploded. By 1933, the foreclosure rate had reached 13.3 per 1,000 mortgaged homes, and by early 1934, half of all urban home mortgages were delinquent.

The federal response fundamentally reshaped how mortgages worked, including how they would be originated. The Federal Housing Administration (FHA), created in 1934, didn’t just insure mortgages; it standardized them. The FHA established uniform appraisal methods, property standards, and underwriting guidelines. For the first time, there was a common framework for evaluating creditworthiness and property value across the entire country.

This standardization made mortgages tradable, which required a secondary market. In 1938, the government created the Federal National Mortgage Association (later known as Fannie Mae) to purchase FHA-insured loans from lenders, providing them with the liquidity to make more loans. The conveyor belt was in place: lenders would originate loans according to FHA standards, then sell them to Fannie Mae.

This created both an opportunity and a challenge. Before the FHA, each lender had to develop its own underwriting criteria and bear the full risk of default, which limited how many loans any single institution could safely make. Uniform requirements meant lenders could now originate loans with confidence they could sell them to Fannie Mae, transferring the risk off their books and freeing up capital to make more loans. But originating at this higher volume would require a much more efficient process. The solution was to break origination into discrete, specialized steps with different people responsible for each step. It was a manufacturing approach, borrowed from the industrial efficiency movements of the era. Each station had its own procedures, checklists, and quality controls.

This design made perfect sense at the time. Mortgages were inherently complex, involving unstructured documents, property inspections, income verification, and credit assessment. Each loan represented a unique combination of borrower circumstances and property characteristics. The work required human judgment at every step: someone to review tax returns, someone to interpret appraisals, someone to verify employment. Throughout the process, different objectives had to be balanced, ensuring credit quality while serving borrowers, managing risk while maintaining efficiency, meeting regulatory requirements while controlling costs. The assembly line didn’t just organize the work; it was built around the assumption that humans would perform it.

The template was set. For the next eight decades, origination would continue to depend on this basic model of specialized human workers processing loans through sequential stages.

The factory scales and specializes (1950s-1970s)

The postwar housing boom put the mortgage factory into high gear. The GI Bill enabled millions of veterans to purchase homes with no down payment, while rising incomes and suburban expansion created unprecedented demand. Between 1940 and 1960, the homeownership rate jumped from 44% to 62%.

As volume grew, so did specialization. The roles that had emerged in the 1940s became distinct professions with their own standards and workflows. Loan processors developed expertise in document collection and file preparation. Underwriters became specialists in guideline interpretation and risk assessment. Closers mastered the coordination of title companies, attorneys, and escrow agents. Each role had its own training, its own career path, and its own set of tools, primarily paper files, typewriters, and telephone calls.

Physical loan files moved between desks like products on a factory floor. A processor would gather documents, organize them in a folder, and pass the file to underwriting. The underwriter would review, make a decision, and send it to closing. After closing, the file moved to post-closing, then to delivery. Though this is a simplified overview, it illustrates how the sequential, human-powered nature of the process became deeply embedded in how the industry operated.

In 1970, Congress created Freddie Mac to provide additional secondary market liquidity, and the mortgage-backed securities market began to develop in earnest. This financial innovation further cemented the factory model. To sell loans into the secondary market, lenders had to demonstrate that each loan met specific guidelines, which meant more documentation, more verification, and more specialized human review at each stage.

Mortgage factory timeline

Technology bolted onto the assembly line (1980s-1990s)

The 1980s brought the first wave of computerization to mortgage origination. Fax machines accelerated document transmission between offices. Early credit bureaus provided faster access to credit reports. Automated valuation models supplemented traditional appraisals.

While these tools enhanced human productivity, they didn’t replace human workers. The fax machine meant a processor could receive documents faster, but someone still had to review them. Credit reports arrived electronically, but an underwriter still had to interpret them.

The most significant development was the emergence of loan origination systems (LOS), software platforms designed to track and manage loan files. These systems digitized the paper trail, allowing lenders to monitor where each loan was in the process and maintain electronic records of key data points. The LOS didn’t change the fundamental workflow; rather, it tracked the same sequential handoffs that had always existed, from application to processing to underwriting to closing.

In the mid-1990s, Fannie Mae and Freddie Mac introduced Desktop Underwriter and Loan Prospector, automated underwriting systems (AUS) that could evaluate borrower creditworthiness and provide a preliminary decision in minutes rather than days. This was a genuine technological breakthrough. For the first time, software was making credit decisions based on fixed rules and standardized data inputs.

Yet even this innovation was absorbed into the existing assembly line. AUS became another station in the process: loan officers would submit files to Desktop Underwriter, receive findings, and then pass the file along to human underwriters who would validate the results, evaluate exceptions, and make final decisions. The technology handled straightforward scenarios with predictable outcomes, but any case requiring nuanced judgment (self-employed borrowers, complex asset situations, property complications) still needed human interpretation. The technology was advisory, not authoritative. Human judgment remained central.

By the end of the 1990s, each department had its own software tools. Loan officers had point-of-sale systems for taking applications. Processors had document management platforms. Underwriters had AUS systems and guideline libraries. But these systems rarely talked to each other. Data had to be manually re-entered from one system to the next. The result was the emergence of silos: separate software ecosystems that reflected the factory’s departmental structure.

The “automation myth” took hold during this era: the belief that technology would make mortgage origination faster and cheaper by making human workers more efficient. Ultimately, augmenting a human-centered process with disconnected software tools was more successful in breeding additional complexity than in delivering true automation.

The explosion of point solutions (2000s-2010s)

The 2000s saw an explosion of specialized software products designed to solve specific problems in the origination process. There were point solutions for income verification, employment verification, asset verification, fraud detection, and compliance. There were tools for appraisal management, title ordering, and flood certification. There were document generation engines, electronic signature systems, and quality control solutions.

Each of these tools addressed a real pain point, and each made a specific task easier. But collectively, they created a new problem: integration complexity. A typical enterprise lender could be using 15-20 different vendor solutions, each requiring separate implementation, training, and maintenance. More problematic, these systems needed to exchange data with each other and with the core LOS, but there were no industry standards for how data should be structured or transmitted.

The result was a massive investment in point-to-point integrations. IT teams spent enormous resources building custom connections between systems. A change to one system often broke connections to others. Data flowed through the network inconsistently, requiring constant human oversight to catch errors and reconcile discrepancies. Rather than being replaced by technology, human workers became the integration layer, the people who manually moved information between systems and resolved the gaps.
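To see why this integration burden balloons, consider a back-of-the-envelope sketch (an illustration, not a figure from the article): with no shared data standard, each pair of systems that must exchange data needs its own custom connection, so the worst case grows quadratically with the number of systems.

```python
# Worst-case point-to-point integration count among n systems.
# Assumes every pair must exchange data directly (no shared standard
# or central hub); real lenders wire up a subset, but growth is still
# quadratic in the number of systems.

def max_point_to_point_links(n: int) -> int:
    """Number of distinct pairwise connections among n systems (n choose 2)."""
    return n * (n - 1) // 2

for n in (5, 15, 20):
    print(f"{n} systems -> up to {max_point_to_point_links(n)} custom integrations")
```

Going from 5 systems to 20 raises the worst case from 10 connections to 190, which is why each added point solution makes the overall network disproportionately harder to maintain.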

The 2008 financial crisis and the resulting Dodd-Frank regulations added another layer of complexity. New compliance requirements meant additional documentation, additional disclosures, and additional quality controls at multiple points in the process. This translated into more software (compliance engines, disclosure platforms, audit tools) and more specialized human roles (compliance officers, quality control analysts, regulatory coordinators). The assembly line grew longer even as it grew more intricate, with conditional branches and recursive loops.

By 2010, it was typical for a single mortgage to touch 20+ different software systems and pass through 30+ human handoffs. And despite all this technology investment, origination costs were rising. In 2015, the average cost to originate a mortgage was $7,046 per loan. By Q1 2025, it had increased 78% to $12,579. The efficiency paradox had become undeniable: more point solutions, yet higher costs.

In the late 2010s, AI-powered point solutions began appearing in the mortgage space, including tools for document classification, data extraction, and fraud detection. These systems could handle specific, well-defined tasks like reading a W-2 or categorizing a bank statement. But they operated within the same fragmented infrastructure, and they couldn’t handle the kind of complex, contextual reasoning that mortgage origination required: evaluating a borrower’s complete financial picture, interpreting guidelines in ambiguous situations, and making holistic credit decisions. The technology simply wasn’t sophisticated enough to take over the judgment-heavy work that humans performed.

The assembly line had become a maze, even though the basic model of sequential stages, human processing, and disconnected systems had remained unchanged since the 1940s.

Today’s mortgage factory: a system at a breaking point

Today, in the mid-2020s, the industry’s approach to mortgage origination still follows the same fundamental pattern established 80 years ago: application, processing, underwriting, closing, delivery. The same specialized roles exist. The same sequential handoffs occur. The same human judgment is required at each stage.

What has changed is the sheer complexity of how the work gets done. A single mortgage touches more than 20 separate software systems. Data is re-entered multiple times as it moves from the POS to the LOS to various vendor platforms and back. Loan officers toggle between multiple screens to gather information. Processors manage order-outs through disparate vendor portals. Underwriters copy and paste data between systems to conduct their analysis.

This complexity creates a paradox: despite billions invested in technology, mortgages still usually take 30-45 days to close. The timeline is consumed by queues—loans waiting for the next available processor, waiting for the next available underwriter, waiting for third-party vendors to complete their work, waiting for documents to be gathered and reviewed. 

But it’s the cost trajectory that tells the story most clearly. Today’s average origination cost of about $12,000 per loan isn’t the result of insufficient technology investment; it’s the consequence of layering increasingly sophisticated tools onto a fundamentally analog framework. Each point solution adds integration complexity. Each integration requires human oversight. Each human handoff introduces delays and potential errors. The more technology gets added, the more expensive the system becomes.

Meanwhile, adjacent industries stand in stark contrast. Insurance underwriting that once took weeks now happens in minutes. Consumer lending that required branch visits now closes instantly on mobile devices. Payment processing that relied on checks and multi-day bank transfers now settles in seconds. Stock trades that required phone calls to brokers now execute in milliseconds. These industries haven’t achieved transformation by adding more tools to manual processes; they’ve rebuilt their infrastructure from scratch around modern technology.

For the mortgage industry to achieve this same kind of transformation, it must acknowledge that the mortgage factory model contains a set of fundamental assumptions: that origination involves sequential and discrete steps, that human judgment is needed at each of these steps, and that origination is too complex to be automated. These assumptions made perfect sense in 1940, but today they prevent us from graduating to a new origination paradigm: one that can deliver mortgages at a speed and cost worthy of the era of technological progress we live in. 

The question facing the industry isn’t which point solution may finally deliver meaningful savings, or which vendor integration will reduce bottlenecks. It’s whether to continue building on top of a mortgage factory model designed for the constraints of 1940, or to embrace a new kind of infrastructure designed from the ground up for the capabilities and the economics of 2026.

The mortgage factory wasn’t wrong for its time. But its time has passed.

Sources

Mortgage News Daily. “Mortgage Banking Profits Increased in 2016.” April 13, 2017.

Mortgage Bankers Association. “IMBs Report Slight Production Losses in First Quarter of 2025.” May 16, 2025.

Wheelock, David C. “The Federal Response to Home Mortgage Distress: Lessons from the Great Depression.” Federal Reserve Bank of St. Louis Review, May/June 2008.

Snowden, Kenneth A. “The Anatomy of a Residential Mortgage Crisis: A Look Back to the 1930s.” NBER Working Paper No. 16244, National Bureau of Economic Research, 2010.

Frame, W. Scott, and Lawrence J. White. “Fussing and Fuming over Fannie and Freddie: How Much Smoke, How Much Fire?” Journal of Economic Perspectives 19, no. 2 (2005): 159-184.

Fetter, Daniel K. “How Do Mortgage Subsidies Affect Home Ownership? Evidence from the Mid-Century GI Bills.” American Economic Journal: Economic Policy 5, no. 2 (2013): 111-147.

U.S. Census Bureau. “Historical Census of Housing Tables – Homeownership.”

Gates, Susan Wharton, et al. “Automated Underwriting in Mortgage Lending: Good News for the Underserved?” Housing Policy Debate 13, no. 2 (2002): 369-391.

HousingWire. “How AI is already transforming and improving the mortgage underwriting process.” 2025.

Engel, Kathleen C., and Patricia A. McCoy. “The Subprime Virus: Reckless Credit, Regulatory Failure, and Next Steps.” Oxford University Press, 2011.

Freddie Mac. “Mortgage Closing Cycle Time.” December 2020.
