MeriTalk – Improving the Outcomes of Government IT

Are You Complying with the Executive Order on Cybersecurity? Wed, 13 Dec 2017 20:07:25 +0000 In May 2017, the President issued an Executive Order on Cybersecurity. Among other requirements, the order holds agency heads accountable for appropriate cyber defenses:

“Agency heads will be held accountable by the President for implementing risk management measures commensurate with the risk and magnitude of the harm that would result from unauthorized access, use, disclosure, disruption, modification, or destruction of IT and data.”

Presidential Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure, May 11, 2017. The order also mandates the use of the NIST Cybersecurity Framework in managing cybersecurity risk.

The Threats

Is your agency complying with the order? Over the last few years, cybersecurity threats have evolved from traditional data theft into data destruction and ransomware attacks. These newer threats are vastly different because they directly impact and threaten the operation of IT infrastructures.

Ransomware itself has grown into a significant threat. Originally, ransomware might attack and encrypt a single endpoint, such as a desktop. To expand their reach, criminals quickly added capabilities to encrypt not just the host, but other devices and shares that could be accessed from the host; and then started moving laterally to other devices before starting the encryption process. These new capabilities threatened not only production servers and data, but also the backup infrastructure.

A Risk-Adjusted Approach

These new threats are precisely the type of attack contemplated by the classic evaluation approach required in the order: assessing both the risk (probability) and the magnitude (scope of harm) of threats.

The risk of a ransomware or destructive attack is high – 60 percent of organizations were hit by ransomware in 2016. By mid-2017, more than half of organizations had been hit by a ransomware attack at least twice (Druva Ransomware Report 2017).

The magnitude of harm from a ransomware or destructive attack can also be significant.  In 2016, a utility in the Midwest spent about $2.5M recovering from a single attack, and a healthcare organization estimated its losses and recovery costs at $10M. In the for-profit space, several organizations reported losses of over $200M from the NotPetya attacks of June 2017.

What Should You Do?

The high risk and magnitude of ransomware and destructive threats mean that agencies must address this problem under the requirements of the order.

Many experts, including several who work for federal agencies, recommend backups as the best defense against a ransomware attack. So while a layered defense that includes employee training, anti-malware, and other measures is important, the specific response to a ransomware threat requires a “recover” strategy as outlined in NIST’s Cybersecurity Framework.

The backup strategy itself can be tailored to match the value of the data. We recommend protecting data for cybersecurity recovery at different levels, based upon its criticality. Three levels of protection – in increasing levels of security – can be deployed:


  • Standard backup – including a backup for data stored in the cloud;
  • Backup stored on a hardened backup infrastructure – deploying least privilege, network segmentation, secured CIFS/NFS shares (or, better yet, a different protocol such as Boost), retention lock/WORM capabilities, etc.; or
  • An isolated backup protected with an operational, logical “air gap.” This backup infrastructure is minimally exposed to the outside world and ready for quick restores (unlike tape backup, which is undependable and can have a very long recovery time).
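The tiering above can be sketched as a simple policy lookup. This is an illustrative sketch only; the tier names and the numeric criticality scale are assumptions, not part of any agency standard:

```python
# Illustrative only: tier names and criticality thresholds are assumptions,
# not any agency's actual policy.
PROTECTION_TIERS = [
    # (minimum criticality, protection level)
    (0, "standard backup"),
    (2, "hardened backup infrastructure"),  # least privilege, segmentation, retention lock
    (3, "isolated backup with operational air gap"),
]

def protection_for(criticality: int) -> str:
    """Return the strongest protection tier whose threshold the data meets.

    criticality: 0 = routine data ... 3 = mission-critical data.
    """
    chosen = PROTECTION_TIERS[0][1]
    for threshold, tier in PROTECTION_TIERS:
        if criticality >= threshold:
            chosen = tier
    return chosen

print(protection_for(1))  # routine data -> standard backup
print(protection_for(3))  # mission-critical -> isolated, air-gapped backup
```

The point of the lookup is simply that protection cost should scale with data criticality, rather than applying the most expensive tier everywhere.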


Cyber defenses should match the threats that we are all seeing today. Make sure that your infrastructure is protected from ransomware and destructive attacks with appropriate defenses.

Dell EMC recommends Data Protection Backup and Isolated Recovery for efficient, cost-effective backup and recovery.


AI is Dragon Glass and Valyrian Steel–Cyber Winter Is Here Wed, 06 Dec 2017 18:27:19 +0000 Kevin Cox is Jon Snow in the war against cyber threats–for the cyber night is here and full of terrors. And, winter isn’t coming–it’s already here. As Continuous Diagnostics and Mitigation (CDM) rounds out phases I and II, our government needs automation, artificial intelligence (AI), and machine learning (ML) to hold back the ugly cyber hordes. AI, ML, and cloud are the dragons, dragon glass, and Valyrian steel that the Department of Homeland Security (DHS) needs to combat the hacking white walkers. Okay, so how to separate fact from fairytale?

AI is Changing the Cyber Game

The web emits 10 million new malware files every month. Do the math. Yesterday’s signature-based approach to combating malware–deriving signatures one file at a time–is a losing proposition.

As if this dated approach to malware detection isn’t bad enough, 60 percent of current intrusions are malware-free in nature. Hackers leverage memory-only threats and living-off-the-land techniques, such as abuse of legitimate Windows tools like PowerShell and WMI. Notably, the most sophisticated and dangerous attacks fall heavily into this fileless category.

For most security scenarios, AI enables capabilities that go far beyond identifying known threats. AI models can determine a file’s maliciousness with no previous knowledge of the file, relying instead on analysis of the file’s innate properties. To defend against both never-before-seen malware and fileless attacks, you need an AI-based cyber strategy.
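As a toy illustration of what analyzing a file’s “innate properties” can mean, the sketch below scores one such property, byte entropy, which real models combine with many other features. The 7.2 threshold is an arbitrary assumption for the example:

```python
import math

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0-8.0)."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

def looks_packed(data: bytes, threshold: float = 7.2) -> bool:
    """High entropy often indicates packing or encryption -- one weak signal,
    of many, that an ML model might weigh. Threshold is illustrative."""
    return byte_entropy(data) > threshold

print(byte_entropy(b"AAAA"))                 # 0.0 -- a single repeated byte
print(looks_packed(bytes(range(256)) * 16))  # True -- uniform bytes, entropy 8.0
```

No signature is involved: the score is computed from the file’s bytes alone, which is what lets this style of analysis flag files no one has seen before.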

Machine Learning and Cloud

Machine learning, a domain within AI, analyzes security-related data, including file “features” and behavioral indicators, across massive data sets. Billions of events are used to “train” the system to detect unknown and never-before-seen attacks based on past behaviors. By training machine learning algorithms with data-rich sources and augmenting them with behavioral analytics, you can deliver next-generation defenses. Realistically, most companies don’t have the threat telemetry to train machine learning, and that limits the effectiveness of their algorithms. But CDM, Einstein, and the Federal government certainly do.

Let’s examine that point further. The value that machine learning brings to the table largely depends on the data available to feed into it. Machine learning can’t create knowledge; it can only extract it. The scope and size of the data are the critical elements affecting machine learning effectiveness. CDM can draw on data from across the federal government and can also take a leadership role by establishing much-needed public-private partnerships for data sharing and analytics.

Importantly, this is where cloud or elastic computing changes the game. Modern threats blend into the environment and only subtly differ from legitimate usage patterns. Detecting them requires looking at a larger amount of data and establishing contextual awareness. So it’s not so much securing the cloud as it is security through the cloud. You need both massive volumes of data and lightning speed analysis to stay one step ahead of today’s determined adversaries. That’s something only the cloud can deliver.

We’re all waiting on tenterhooks for CDM phases III and IV. The undead better bring their A game–Kevin “Jon Snow” Cox has some new weapons in his arsenal. It’s AI and cloud to the rescue.

Join MeriTalk and CrowdStrike on Jan. 18, 2018, from 4:30 to 7:30 p.m. at the W Hotel to network and continue the dialogue on this proactive approach to more effective cybersecurity for public sector organizations. Click here for more information and to register.

Data Loss Prevention: Protecting Data Wherever It Resides Wed, 29 Nov 2017 21:09:43 +0000 Federal agencies have a data problem. Data that was traditionally inside four walls is now everywhere. Employees and vendors access it from all kinds of devices, located in all kinds of places, making it increasingly challenging for security teams to see what those users are doing with that data.

In a nutshell, agencies have lost control of their data, devices and users. We have seen the repercussions in the headlines. Contractors leaking classified reports, employees getting infected with malware, Harold Martin III. As we head into 2018, agencies must shift their approach. The traditional strategy of fortifying the perimeter is no longer effective. The strategy of the present and future must focus on protecting data wherever it is.

If you’re a security practitioner at a Federal agency, you most likely are all too familiar with Data Loss Prevention (DLP) technology.  It was known as the security solution for blocking the transmission of data outside the organization. Considering today’s data problem, DLP is going through a rebirth. Its capabilities are expanding so it can protect sensitive data both inside and outside the organization, while not overburdening limited analyst resources.

The expansion entails integrating DLP with newer technologies such as Cloud Access Security Broker (CASB) tools to protect data not just on premises, but also in the cloud. Encryption is being added to the mix to protect data in transit. Tagging is another important technology that strengthens DLP. It enables agencies to label documents (e.g., classified, unclassified) to give their DLP technology hints as to what’s important and what’s not. For example, if a document is tagged “classified,” then DLP knows to block or encrypt it. Multi-factor authentication is also important because it requires the user receiving the data to properly identify herself before the data can be opened.
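The tag-driven behavior described above amounts to a lookup from document tags to the most restrictive required action. A minimal sketch; the tag names and actions are illustrative assumptions, not any DLP product’s actual API:

```python
# Hypothetical tag-to-action policy; tag names and actions are invented
# for illustration, not taken from a specific DLP product.
POLICY = {
    "classified": "block",
    "sensitive": "encrypt",
    "public": "allow",
}

def dlp_action(tags: set) -> str:
    """Return the most restrictive action any tag on the document demands."""
    severity = ["allow", "encrypt", "block"]  # least to most restrictive
    actions = [POLICY.get(t, "allow") for t in tags] or ["allow"]
    return max(actions, key=severity.index)

print(dlp_action({"classified", "public"}))  # block: classified wins
print(dlp_action({"sensitive"}))             # encrypt
print(dlp_action(set()))                     # allow: untagged passes through
```

Taking the most restrictive action across all tags is the safe default: a document that is both “public” and “classified” must still be blocked.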

One more technology that integrates with DLP, and serves as the glue tying the other tools together, is User and Entity Behavior Analytics (UEBA). UEBA technologies collect the telemetry generated by the tools mentioned above, identify potential malicious and non-malicious insider and outsider activities, and deliver a prioritized list of the most critical incidents that DLP analysts must investigate immediately. The integration significantly reduces false positives because, whereas DLP focuses strictly on the data, UEBA determines whether the user who is elevating the risk of the data being compromised is indeed a threat or is business-justified.

For example, let’s say “Joe” from accounting was working on a lengthy project that required him to send a series of classified documents to a third-party contractor outside the agency over an extended period of time. Every time Joe sent the data, DLP would flag it and alert analysts, who would then waste their time investigating each alert and questioning Joe (interrupting his work and potentially lowering morale). And it would all be for nothing: the action was business-justified.

UEBA would prevent this situation from happening. The technology would learn the first time that Joe’s actions were business-justified, and whitelist the event as business as usual, so that analysts would never again receive an alert about Joe’s behavior.

On the flip side, if Joe was not given permission to send the information, UEBA would prioritize the alert based on the fact that the information was classified and that Joe’s behavior was unusual compared to his own history, his peers, and the overall team.
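Joe’s scenario can be sketched as a minimal baseline filter: alert on a (user, activity) pair until an analyst marks it business-justified, then suppress it. Real UEBA products model far richer behavior; the class and field names here are invented for illustration:

```python
# Toy UEBA-style suppression: once an analyst marks a (user, activity) pair
# business-justified, matching events no longer raise alerts.
class BaselineFilter:
    def __init__(self):
        self._justified = set()

    def mark_justified(self, user: str, activity: str) -> None:
        """Analyst confirms this behavior is business as usual."""
        self._justified.add((user, activity))

    def should_alert(self, user: str, activity: str) -> bool:
        """Alert only on behavior not yet known to be justified."""
        return (user, activity) not in self._justified

ueba = BaselineFilter()
print(ueba.should_alert("joe", "send_classified_to_contractor"))  # True: first time
ueba.mark_justified("joe", "send_classified_to_contractor")       # analyst confirms
print(ueba.should_alert("joe", "send_classified_to_contractor"))  # False: suppressed
print(ueba.should_alert("joe", "mass_download"))                  # True: still unusual
```

Note that suppression is per activity, not per user: marking Joe’s contractor transfers as justified does nothing to quiet an alert on a different, anomalous behavior from the same account.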

Under either circumstance, DLP, integrated with UEBA and the other tools mentioned above, protects data well beyond the four walls of the agency. The technology has evolved to help analysts understand what’s truly sensitive, which data does and does not need protecting, how data should be tagged, how it should be protected, whether it needs to be encrypted, who is handling it, and what’s important to investigate. The goal is to protect agencies’ most sensitive data, enable collaboration, discover malicious behavior, and prioritize investigation. That’s the DLP of today, tomorrow, and beyond.

Automated Authorization: A New “Streaming” Service for Federal IT Mon, 27 Nov 2017 14:34:28 +0000 Before you say “no,” Federal cybersecurity professionals, hear us out. To borrow a line from Riggs in Lethal Weapon 2, “C’mon, say yes! Be original, everyone else says ‘no.’”

Even with all the discussion around efficiency and modernization, the typical Certification and Accreditation (C&A) process takes six months and costs more than $100,000 to complete. For larger IT systems, doubling those totals isn’t out of the norm. At many organizations, the package is partially updated every six months and then redone every three years for each system. So, over a five-year period, certifying and accrediting a system can cost more than $500,000. Wow, that’s a lot of money!

With an increasing attack surface resulting in millions of new threats every year, partially updating C&A documents every six months, remediating a few Plan of Action and Milestones items, and updating all the documents every three years won’t, and doesn’t, keep the bad guys out of Federal networks.

This compliance process, designed and refined over the past 15 years, was sorely needed when conceived–and it remains the primary means of governing Federal IT systems. Mainframes, client-server, and early three-tier architectures ruled the day, with an eventual light sprinkle of this new tech called virtualization. Cybersecurity was an afterthought: build everything, then bolt on cyber at the end of the process. Moving typical systems from procurement to implementation took years. Now a server, and even an application, can be provisioned in minutes, and the first release can happen in one month. The times and technology have changed, yet updates and adoption have lagged significantly.

Depressed? Don’t be; there’s help on the way. The latest Risk Management Framework was released for comment, and some of the NIST 800-series publications are also being revised. The Department of Homeland Security-led Continuous Diagnostics and Mitigation program is rolling out across Federal agencies, providing an opportunity to increase visibility and analytics capabilities for applications and systems on their networks. And since we started writing this article, 20 new cyber companies have entered the U.S. market that will help better identify cyber threats and quantify and qualify risks based on threats, vulnerabilities, and cost to mitigate. The greatest minds are collectively updating the guidance and the conversations grow in number…so why are we still not broadly considering the idea of automating security controls and authorizations?

From a technology perspective, the Lego pieces are in the box to get started building the Millennium Falcon. In terms of FISMA compliance, today’s Federal CIOs, CISOs, and Program Managers have access to more than 70 FedRAMP-certified cloud providers, including a few authorized at the High baseline. Early adopters of DevSecOps have worked alongside assurance professionals, shifted cybersecurity left in the process, and embedded cyber into the DNA of secure automation workloads, from development to production. Automation at every level is possible and can be utilized to achieve assurance and reliability previously unavailable with human-implemented processes.

Is it perfect? Nope. Nothing is. This is not about perfection, rather risk management and responsible evolution. The tools are in the toolbox.

So, how do agencies get started? How about a mission-critical mainframe? Ah, no. Well then, how about a FIPS moderate back office system on a few virtual servers? Close, but not quite. Let’s instead start with a “net new,” moderate-level data system. Even better if you can take advantage of the incoming Modernizing Government Technology (MGT) Act to actually rethink a business process/application, rather than carry the same less-than-optimized processes to a new environment and call it modernization.

Some of the criteria to qualify: it’s a new project, not a bolted-on enhancement to an old system. It must be hosted with a FedRAMP cloud provider. Automate as much as possible, including your security controls, in partnership with your security operational and policy professionals. The development environment should be provisioned using good DevSecOps practices. Make sure you embed cyber hygiene and analytics at the lowest level of code possible. Lather, rinse, repeat for testing. Then, once satisfied that the application and its cyber controls are implemented correctly, light up production. To repeat: the key to success is a collaborative and transparent partnership among all stakeholders, including operational and policy professionals…stakeholder engagement.
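The gating idea behind those criteria can be sketched as a pipeline where each stage’s automated security checks must pass before promotion to the next. Stage and check names are assumptions for illustration, not an actual agency workflow:

```python
# A minimal sketch of "security as a pipeline gate": each stage must pass its
# automated checks before the workload is promoted. Names are illustrative.
PIPELINE = [
    ("development", ["provisioned_as_code", "security_controls_automated"]),
    ("testing", ["cyber_hygiene_scans", "control_validation"]),
    ("production", ["stakeholder_signoff"]),
]

def promote(results: dict) -> str:
    """Return the furthest stage reached; stop at the first failed check."""
    reached = "none"
    for stage, checks in PIPELINE:
        if not all(results.get(c, False) for c in checks):
            return reached
        reached = stage
    return reached

print(promote({"provisioned_as_code": True, "security_controls_automated": True,
               "cyber_hygiene_scans": True, "control_validation": False}))
# "development": a testing check failed, so the app never reaches production
```

The design choice worth noting is that a failed check halts promotion automatically, rather than leaving go/no-go to a document review months later.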

Beta was overtaken by VHS. VHS got smoked by Blockbuster. Blockbuster got rolled by RedBox. Netflix, Amazon, and Hulu took out RedBox using mobile phones and broadband. Traditional Federal C&A process, meet Automated Authorization. So, for the traditional certification and accreditation process, “we’re getting too old for this.” We couldn’t agree more, Sgt. Murtaugh.

Rob Palmer is the executive vice president and CTO for ShorePoint, a privately held cybersecurity services firm serving both private and public-sector customers. Palmer is a former senior executive with the Department of Homeland Security (DHS) where he most recently held the position of deputy CTO and executive director for strategic technology management.
Keith Trippie is a retired DHS IT executive and entrepreneur. He is the founder of Shop4Clouds, a digital marketing platform, and urMuv, a neighborhood discovery app. He has also launched GotUrSix TV, a digital media platform to share the personal stories of active duty service members, veterans, and military spouses.
Moving Beyond Cloud Security Fears Tue, 14 Nov 2017 15:33:29 +0000 The federal government has started to embrace the positive impact of cloud on cybersecurity efforts. We first saw this in the May Cybersecurity Executive Order, which outlined a shift to cloud as a key part of cybersecurity strategy. During a briefing, Homeland Security Advisor Tom Bossert said, “We’ve got to move to the cloud and try to protect ourselves instead of fracturing our security posture.” And the new Report to the President on Federal IT Modernization, released by the White House Office of American Innovation, likewise underlines the importance of cloud and a shift to shared services.

Agencies are gaining cloud deployment momentum now because they are matching the cloud to the mission – in many cases implementing highly secure on-premises or hybrid cloud solutions.

At a recent cloud event, John Hale, Chief of Application Services, Defense Information Systems Agency (DISA) said, “the direction that we’re getting from the senior members of the department is ‘move everything to the cloud now.’” He is working to create Cloud Access Points (CAPs) to protect Department of Defense (DoD) networks from the rest of the public cloud and enable DoD security requirement compliance. The recently awarded MilCloud 2.0 will be a hyperconverged on-premise cloud system that is expected to enable approximately 70 percent cost savings for DoD.

And, on September 20, the U.S. Air Force awarded Dell EMC, General Dynamics, and Microsoft a $1 billion, five-year contract to implement a Cloud Hosted Enterprise Services (CHES) program. This is the largest-ever cloud-based unified communications and collaboration contract in the federal space.

We have many examples of how federal agencies are finally moving past the “devil you know is better than the devil you don’t” mentality. But how does cloud, specifically hybrid and secure on-premise cloud, improve security? Agencies can:

  • Maintain control and compliance with security best practices
  • Align data protection services with application demands
  • Access IT services in the event of a disaster with active provisioning
  • Integrate existing security tools and services

FDR famously told us that the only thing we have to fear is fear itself. Digital transformation, powered by secure hybrid and on-premises cloud environments, will modernize government services, improve governance and transparency, and keep federal data and systems more secure.


Federal IoT: Where Innovation and Cyber Risks Collide Mon, 30 Oct 2017 14:29:40 +0000 Gartner forecasts that by 2020, 20.4 billion devices will be connected across the Internet of Things (IoT). The IoT brings the promise of new possibilities, but to unlock them, agencies must change how they think about data and how to keep it secure.

There are four primary ways IoT can provide value to agencies and support innovation:

  • Driving operational efficiency – improve effectiveness while simultaneously reducing cost
  • Improving constituent experience – discover new ways to engage users
  • Mitigating risk – improve security by detecting failures before they happen
  • Driving mission success – discover new paths to mission success through data insights

Both Department of Defense (DoD) and civilian agencies are currently using connected technologies including drones, cameras, sensors, satellites, etc., to support their missions. For example, the Air Force is combining surveillance and flight sensor data to provide detailed threat information in real-time; the Navy is using a network of connected buoys with sonar capability to detect submarines more quickly and efficiently. On the civilian side, the General Services Administration (GSA) is using a network of low cost motion sensors to turn off the lights when employees are not at their desks – reducing environmental inefficiencies and overhead costs.

While the potential for innovation is great, federal IT teams face a number of challenges when implementing and using the IoT. For starters, up to 80% of IoT data will be unstructured, and these data points have to be stored, managed, and analyzed in a methodical way. For agencies, this means preparing their aging infrastructure for the influx of data from IoT devices on the edge.

The biggest challenge, however, is security. As agencies implement new layers of architecture and processors to harness the IoT, they must address cybersecurity concerns for both operational technology (OT) devices and traditional IT devices – not straightforward, as IT and OT have very different goals and constraints. And there is an enormous variety of IoT devices that will come into play, each introducing a different level of risk.

It is important for federal agencies to be “paranoid, but not paralyzed” when it comes to IoT security. If approached in the right way – by having heavily encrypted storage environments and a cyber plan that provides for the protection of all endpoints/networks – agencies can effectively manage the security risks and take advantage of the significant opportunity ahead.



Rally the Troops: Building a Cyber Workforce for the Future Mon, 23 Oct 2017 14:11:57 +0000 Federal agencies face a continual struggle to attract top talent in the cyber workforce. Why? Because it is difficult for agencies to find qualified personnel, hard to retain security workers, and there is often an insufficient understanding of job requirements. This impacts us all – as it makes it more difficult for agencies to make good, risk-based decisions as they modernize federal IT and work to meet mission objectives.

The Presidential Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure required a group of agency leaders to jointly assess the federal government’s efforts to train the cybersecurity workforce, and to develop a report to the President with their findings and recommendations.

These recommendations are not yet available, but federal agencies must find a way to provide potential employees with reasons to choose federal service over private sector perks. Representative Will Hurd (R-Texas) is trying to do just that with his proposed cybersecurity workforce program, the Cyber National Guard. The program would provide students a free cybersecurity education, and in return, students would work in federal service for an equal amount of time.

On C-SPAN, Rep. Hurd explained there is still a significant shortage of computing talent across the board – in Texas alone, 42,000 computing jobs went unfilled in 2015, with an average salary of $89,000. In that same year, Texas produced just 2,100 computer scientists. We need new approaches.

Rep. Hurd’s idea for a Cyber National Guard is that if you are going to college, and will study something related to cybersecurity, the federal government will find you a scholarship. In return, when you graduate, you will serve in government for the same amount of time you were in school. Rep. Hurd also explained that once you are finished with government service, the private sector will loan you back for a period of time – one weekend a month, ten days a quarter, or similar.  “This will improve the cross pollination of ideas between the public and private sectors,” Hurd states.

Once you have the troops in place, you have to give them the tools to be successful. President Trump put this component into motion when he elevated U.S. Cyber Command to a full combatant command in August. Trump said in a statement that this will “streamline command and control of time-sensitive cyberspace operations by consolidating them under a single commander with authorities commensurate with the importance of such operations….and ensure that critical cyberspace operations are adequately funded.” As is to be expected, there are hurdles to moving these initiatives forward, including:

  • Funding for cybersecurity education.
  • Showing a clearly defined career path for cybersecurity growth within the federal government.
  • Having a standardized listing of cyber jobs within government to ensure students receive the proper credentials.

Fortunately, most agree across the aisle that improving the federal cybersecurity workforce is critical to our future. When asked about the path forward, Representative Hurd is optimistic, sharing, “There really is a group of us that are committed to a bipartisan solution to this problem. So, we’re going to have a posse.”

That’s good, because we know we are stronger together, and we will need continued collaboration and open dialogue among federal agencies, our elected leaders, and the public and private sectors to keep our information and systems secure.

Improved NIST Framework Supports Agency FITARA Goals Tue, 17 Oct 2017 15:14:26 +0000 With the release of the fourth FITARA scorecard, we saw agencies stall on progress – more agency grades declined than improved, and 15 agencies’ grades remained neutral.

One shining star was the United States Agency for International Development (USAID) – the first agency ever to receive an overall A. How did they do it? According to a USAID official, they focused hard on transparency and risk management – where they received an “A.” This was not the case for most of their counterparts, however, as 14 agencies received a “C” or lower in that category.

Risk management is one of the more difficult areas in which agencies can show success, but every CIO should be using the National Institute of Standards and Technology (NIST) Framework in that area. NIST recently released an updated version of the Framework for public comment, in the hopes that it will be easier to utilize and implement.

These were the most notable changes to the updated version:

  • Refined managing cyber supply chain risks – framework now has a common vocabulary so agencies working on cyber supply chain projects can clearly understand cybersecurity needs.
  • Revised “Identity Management and Access Control” category – framework now has clarified and expanded definitions of the terms authentication and authorization; added and defined the concept of “identity proofing”.
  • Introduced measurement methods for cybersecurity – framework now gives guidance on how to measure how well an agency is reducing risk and identifies overall agency benefits.

NIST has been gathering feedback on the Framework changes, and is expected to release the final version this fall. Hopefully, federal CIOs can use the updated Framework to effect positive change on their cybersecurity and risk management projects – and in turn, see an upward tick in grades when the next scorecard is released in December.

Learn more about Dell EMC’s portfolio of cybersecurity capabilities for government.

The CDM Marathon: How Feds are Keeping Pace Wed, 11 Oct 2017 14:36:08 +0000 While the Cybersecurity Sprint focused attention on how to generate improvements quickly, one of our most important cyber efforts – the Department of Homeland Security (DHS) Continuous Diagnostics and Mitigation (CDM) program – is unquestionably a marathon. Now in its fourth year, the program is maturing agencies’ abilities to identify cyber risks and adopt a risk-based approach to mitigation.

The program is entering Phase 3, but agency progress has been staggered. Every agency started from a different point of cybersecurity maturity, so this is not surprising.

Phase 1 involved mapping networks to determine what agencies would need to improve threat protection; Phase 2 focused on identifying who has access to the network and how access management is handled. Up next, Phase 3 focuses on boundary protection and incident response.

What was initially surprising was the degree to which agencies discovered during Phase 1 that they were underreporting device numbers. James Quinn, lead systems engineer on the CDM Program at DHS, said that DHS estimated federal agencies would map approximately two million assets, but agencies ended up finding approximately four million.

We have to anticipate this challenge will continue to grow with the Internet of Things (IoT). Every internet-connected device is a potential vulnerability, so improving asset management and establishing a secure supply chain are critical to securing federal systems and information.

When we think about supply chain risk management, we think about our devices and the systems and software we use to protect them. The CDM Project Management Office (PMO) requires vendors submitting products for the CDM Approved Product List (APL) to provide details on their supply chain risk management policies. See more: CDM Supply Chain Risk Management plan.

RSA Archer, a Dell Technologies company, serves as the platform for the agency and federal dashboards. At the agency level, the dashboard captures data locally from network sensors, scores the data, and shows the “worst problems first” for operators – i.e., it enables a risk-based approach. Agencies are in the process of deploying their dashboards, and the federal dashboard is scheduled to deploy this year.
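The “worst problems first” ordering is, at its core, a sort of scored findings in descending risk order. A minimal sketch, with invented hosts and scores rather than actual CDM dashboard data:

```python
# Illustrative "worst problems first" view: findings sorted by risk score so
# operators see the riskiest item at the top. Hosts and scores are invented.
findings = [
    {"host": "web-01", "issue": "expired certificate", "risk": 3.1},
    {"host": "db-02", "issue": "unpatched critical CVE", "risk": 9.8},
    {"host": "file-03", "issue": "weak cipher enabled", "risk": 5.4},
]

worst_first = sorted(findings, key=lambda f: f["risk"], reverse=True)
for f in worst_first:
    print(f["risk"], f["host"], f["issue"])
# The unpatched critical CVE on db-02 (risk 9.8) appears first.
```

The value is in the ordering, not the scores themselves: with thousands of findings, a risk-ranked queue tells operators where the next hour of effort buys the most risk reduction.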

As agencies and the CDM Project Management Office move forward, they will be tackling ongoing challenges including the need for acquisition flexibility, how to speed the acquisition process, how to integrate FedRAMP, and what’s next for Trusted Internet Connections (TIC).

That’s an already tall order that will continue to grow – and we do win or lose together in this race.

LPTA is Hurting Employee Relations for Small Businesses Fri, 06 Oct 2017 12:59:03 +0000 Introduced in June and passed by voice vote with no dissent, HR 3019, the Promoting Value Based Procurement Act of 2017, acknowledges that the Lowest Price Technically Acceptable (LPTA) approach has not produced strong results on contracts. The legislation forces agencies to justify an LPTA approach by “comprehensively and clearly describing the minimum requirements expressed in terms of performance objectives, measures, and standards that will be used to determine the acceptability of offers.” Agencies must exert more control over reporting and standards on LPTA contracts to justify the cost relative to the quality of the product.

This is a welcome step considering the pressure LPTA puts on contractors more interested in quality work and consistently satisfied customers than in volumes of one-hit, transactional contracts. LPTA disincentivizes businesses, especially small businesses that lack capital, from being good, responsible employers. The trickle-down effects of LPTA are easy to trace: unhappy employees perform unhappily, lowering the quality of their work products and the performance of their employers and customers.

The largest expenses for any employer are employee salaries and benefits. When contractors compete on an LPTA basis, they must keep these costs as low as possible to remain competitive, leading to a scenario where skilled employees are underpaid and unhappy, or unskilled employees are fairly paid but cannot perform to a high standard. On the first point, an Employee Job Satisfaction and Engagement survey conducted by the Society for Human Resource Management (SHRM) found that compensation and benefits are among the top five measures of job satisfaction. Another study, conducted by Harris Poll on behalf of Glassdoor, found that 57 percent of respondents said benefits and perks are among their top considerations before accepting a job.

While the perks of a job may sound superfluous, when considering the cost of a disengaged employee, it becomes clear that benefits are worth the investment. For example, disengaged employees are often poor collaborators, lack enthusiasm for projects, miss deadlines, and don’t take initiative. For companies that do provide multiple benefits, such as wellness benefits, SHRM found that 40 percent saw decreases in unplanned absences, and 33 percent cited a direct increase in productivity.

The absence of good pay and benefits also increases the risk of turnover–a major issue for government agencies that rely on contractors and institutional knowledge to be successful. When employees leave, contractors spend money to replace them. SHRM found that the total costs of replacing an employee can range from 90 percent to 200 percent of their annual income, depending on several factors. Contractors recoup that cost by raising their rates to the government over the long term. This leads to excessive costs for the government in the short term (due to lost productivity and knowledge loss) and long term (due to the higher rates). LPTA exacerbates this issue, leaving government contracts in short-term turmoil and actually raising costs over the long term.

HR 3019’s requirement that agencies justify an LPTA strategy is a good first step, but more action can be taken. Reliance on LPTA inevitably means that contracting firms will drive down prices in order to remain competitive, which will hurt their employees and the government. If the focus is changed from procuring services at a low price to procuring services at a fair, market-based price, the quality of the work the government receives will inevitably increase. While LPTA is well-intentioned–what better way to save taxpayer money than by driving prices down through a simple calculation and competitive procurement?–in practice it has proven to hold less value than strategies focused on performance balanced with cost.

The time has come to move on from LPTA and embrace other, smarter procurement strategies.
