
The Government Accountability Office (GAO) is calling for more federal oversight of artificial intelligence (AI)-powered tools that are reshaping the homebuying process, warning that gaps in transparency and oversight could expose prospective homebuyers to risks.
In a Monday report, GAO said that digital tools known as “proptech” are increasingly being used in real estate – including automated valuation models, automated underwriting systems, online real-estate platforms, and electronic closing products.
While those tools offer convenience and speed for buyers and lenders, GAO warned that, left unchecked, AI-enabled tools may deepen inequities across the housing market by exposing consumers to potential discrimination, data privacy risks, and distorted housing prices.
“Nearly all homebuyers use online real estate platforms to search for homes and some may use other services and features, such as valuation models, to get home price estimates,” GAO explained. “Lenders and the enterprises use automated valuation models for multiple functions, including estimating the market value of properties. The enterprises developed their own automated underwriting systems used by nonbank mortgage lenders.”
To address those concerns, GAO said that federal agencies – including the Federal Housing Finance Agency (FHFA), the Consumer Financial Protection Bureau, the Federal Trade Commission, and the Departments of Veterans Affairs and Housing and Urban Development (HUD) – should revise their compliance requirements and their supervision of companies using those tools to ensure fair lending when homebuyers apply for a mortgage.
Although oversight agencies allow the use of AI-powered tools, they have few to no policies regulating them, GAO reported.
For example, automated valuation models may perpetuate discrimination by using historical data, GAO found.
“While automated valuation models do not take into account the race of participants in individual transactions, they may perpetuate valuation disparities by continuing to undervalue properties in historically undervalued communities,” GAO said, adding that while “industry-sponsored tests have found that models may estimate home values more reliably than traditional appraisals, other research has found that reliance on historical housing price data makes it difficult to correct for the effects of past discrimination embedded in these data.”
Limited data can also reduce the reliability of AI-powered valuation tools. GAO cited one company that told the agency its automated models can estimate values for only around 85% of all U.S. properties, lacking sufficient data for the remaining 15%.
And while AI is not yet widely used in automated underwriting systems – which help lenders assess eligibility and credit risk when making loan decisions – its “black box” nature could complicate a lender’s ability to explain why a loan was denied, as required by law. That opacity makes it difficult for borrowers to understand why their applications were denied and for lenders to comply with regulations, GAO explained.
The industry has made some progress in addressing these risks. For example, the online real estate marketplace Zillow developed a tool to identify potentially discriminatory language in searches and chatbots.
Meanwhile, in government, HUD investigated algorithmic steering and discriminatory ad-targeting practices, while also issuing new guidance clarifying liability for digital-platform housing advertisements under the Fair Housing Act.
FHFA also began shifting its fair-lending supervision of Fannie Mae and Freddie Mac under new 2025 policy directives, but it has not yet clarified its updated compliance expectations, GAO said. The two government-sponsored enterprises, which are overseen by FHFA and buy mortgages from lenders, began using AI in their models in 2018 and 2020, respectively.
GAO recommended that FHFA provide direction to Fannie Mae and Freddie Mac so that they “clearly understand FHFA’s supervisory expectations and compliance requirements, which are intended to help promote sustainable housing opportunities for underserved communities.”
FHFA did not agree or disagree with the recommendation.