
Why Congress Needs to Take a Long, Hard Look at 18F

How many Presidential Innovation Fellows does it take to type 18F?

That may not sound like a serious question, but I can guarantee the anger of taxpayers and lawmakers will be real when they learn that it took a small team of designers from the General Services Administration’s 18F an entire weekend to come up with the following change to the digital service team’s logo.

[Image: 18F logos]

It gets worse. This so-called weekend “brandathon,” as it came to be known, began with an organization-wide workshop on 18F’s core values. “After this workshop, several design studios, and hours of work, the branding team had an initial set of deliverables to share with the rest of 18F,” wrote 18F designers Kate Garklavs and Jennifer Thibault in a mind-numbing 826-word blog post published Thursday on the 18F website.

This is what 18F calls “agile branding.” The weekend design “brandathon” took place last August—nine months ago.

Eric Ronne, one of 18F’s digital designers who participated in the summer work session, summed up the experience as follows: “At 18F we’re always changing and improving government interactions for our users. We iterate constantly here, and now we’ve iterated on our logo, too,” he said. “Our goal was to refresh the mark while nodding to the past, to create a straightforward update that’s accessible, bold, modern, and flexible.”

As if spending a weekend of design hours and nine months of internal back-and-forth discussions wasn’t enough to come up with this epic feat of Photoshop 101, the well-compensated digital branding innovators at 18F also spent time creating a collection of images featuring the new logo and inspirational messaging, “the optimism of which is central to our brand,” according to the blog post.

[Image: 18F sky photo]

“These images, which team members use as desktop art, weren’t exactly the highest priority ‘need’ item, but they were a fast way to show the team how the new system could begin to flex in more exciting ways than just templates,” wrote Garklavs and Thibault.

And this epic waste of tax dollars isn’t over. “We plan to create infographic templates for our social media accounts. And eventually, we’ll restructure and restyle our website, another outfit that we’ve outgrown since we started in March 2014,” they wrote.

I’m all for digital services in government, and for improving government services through technology and innovation. But the amount of time, effort, and money that GSA dedicated to changing a font in Photoshop is an obscene misappropriation of government resources. This type of maddening waste is exactly why so many observers in and out of government have come to question the purpose, mission, and value of 18F.

The innovators at 18F would be wise to “iterate” on something more difficult and of more importance to the American people. If they don’t start doing that soon, then the days of the 18F experiment in government are surely numbered.

The Situation Report: Silicon Valley Disconnect

Valley Girl

Gag me with a spoon, Erie Meyer. America’s self-described “foremost technologist named after a great lake…working on service delivery” at the U.S. Digital Service recently described in a Twitter post what her former career was really all about.

Although there’s no serious Valley experience detectable in Meyer’s official bio, this issue is actually a very serious concern among those who have been critical of the new digital service stars recruited from Silicon Valley to help save government from its Luddite past. As the argument goes, Todd Park—the digital service recruiter-in-chief—has staffed the government’s digital service teams with a bunch of wide-eyed techies who have been unable to divorce themselves from their former Silicon Valley mission of designing click-bait, or as GitHub Evangelist Ben Balter put it, working to “implement the button for those clicks.”

So far, “implementing buttons” seems to be right in the USDS sweet spot. Consider their accomplishments: A College Scorecard website for the Education Department; a “revamped” online presence for the Federal Election Commission; and a Web template designer for static government websites. There have been others, but nothing revolutionary.

This is a big problem for the USDS at the White House and its sister organization, the General Services Administration’s 18F. Both organizations should be bracing for a major slap in the face from the Government Accountability Office, which is planning to release an audit of their progress to date. And The Situation Report has picked up strong signals that these particular digital audit pages will make a loud thud when they fall on the desk of U.S. Chief Information Officer Tony Scott at the Office of Management and Budget.

Deep Throat Meets Demosthenes

On May 16, MeriTalk will begin publishing The Federal IT Papers—An Exclusive Insider Account of IT Decision-making Gone Wrong. This weekly series of stories is based on a book-length work by a current senior Federal IT official who goes by the pseudonym Demosthenes—a great orator in Ancient Greece, and the name chosen by Valentine Wiggin in Ender’s Game. Our author insisted on anonymity due to a fear of reprisals.

In our first installment, Demosthenes takes issue with the government’s infatuation with the U.S. Digital Service and the way in which career Federal IT workers are treated.

“The problem with the USDS is that they make all these promises and have cachet. They are in the West Wing after all,” the author wrote. “When they recruit people, the mind-set coming in is that everyone who was working in Federal IT before I got here is shit. As a result they get rid of everyone who knows anything.”

In one case, a USDS specialist recruited from Silicon Valley by an agency CIO went so far as to undermine the CIO who hired them. The result? According to Demosthenes, the agency head decided it was time to get a new CIO.

“And now I see them embedding themselves in preparation for this next administration, and I can’t sit idly by and not speak up. Somebody has to pierce this bubble and say, ‘hey guys, thanks for importing a whole bunch of really smart people who don’t know the first thing about how to get shit done here. Thanks for bringing in a bunch of people who don’t value my contributions and don’t want to hear about what has worked and what hasn’t.’ ”

You can read more of The Federal IT Papers starting May 16.

Beyond Escape Velocity

Let’s make it three and go for the hat trick on digital services. My left coast listening post reports that it’s no coincidence that GSA’s latest innovation partnership in San Francisco will be located right next door to the West Coast 18F. My Market Street surveillance station has picked up strong signals that the so-called Superpublic Innovation Lab is part of GSA’s master plan to extend 18F’s presence both physically and politically—reaching a hiring velocity that makes it virtually impossible for the next administration to escape the pull of government digital services.

GAO Visits VA

My Vermont Avenue surveillance station in Washington, D.C., has noticed unusual Government Accountability Office activity at the headquarters of the Department of Veterans Affairs. Sources tell The Situation Report that auditors are interested in VA’s progress on identity theft prevention. In particular, investigators want to know how far VA has come in eliminating the use of Social Security numbers as a key identification number for veterans.

The Weekend Reader – May 6

Cybersecurity Goals to Guide Federal Software Spending

ecommercetimes.com
Evolving requirements to greatly improve Federal protection of information technology resources will shape Federal software spending. In fact, Federal cyberprotection goals should be augmented and significantly modified, according to recent studies of the Federal market. The linkage between increased Federal investment in cybersecurity and the requirements for bolstering IT protection is portrayed in two newly released reports.

U.S. Chief Data Scientist: Entrepreneurs Should do a Tour of Duty in Government

venturebeat.com
There’s no question that the U.S. government has collected an incredible amount of data. Whether for things like the Census, housing, agriculture, transportation, or health care, Federal agencies have accumulated data from around the country. In the past seven years, the White House has made efforts to leverage more technology at the Federal level.


Microsoft’s CEO Explains Why His Company Sued the U.S. Government

cio.com
Microsoft surprised the world last month when it filed a lawsuit against the U.S. Department of Justice, alleging that the frequent practice of attaching gag orders to search warrants for customer data violates the U.S. Constitution.
On Monday, CEO Satya Nadella told a group of tech luminaries why the company did so: Microsoft has a strong view on its privacy promises to users, and the company will fight to prevent government overreach that, in its view, compromises the principles of privacy.

Why Open and Frugal Should Be the Default for Government IT

techwire.net
With public-sector information-technology projects at any level of government, one does not have to look too far to find examples of waste and worse. In the wake of a series of failed projects, Hawaii is auditing its last four years of IT spending. On the local-government level, it would be hard to find a better example of what can go wrong than New York City’s CityTime payroll-system project, abandoned after its costs ballooned from $63 million to $700 million amid mismanagement and outright corruption.


Tech Companies are Unlikely to Oppose Government Demands on Data Access

firstpost.com
Can other technology companies defy the government the way Apple did when asked to help U.S. investigators crack the code of iPhone 5C? Unlikely. Especially in jurisdictions where the governments may not be so benign in pursuing hidden material in electronic devices or data centers. Not EMC Corporation, the world’s largest data storage multinational.


The Situation Report: FirstNet Worries and VA’s Leadership Rumors

Public Safety Communications

Remember that $7 billion Congress allocated to help establish a nationwide public safety broadband network, known as FirstNet? Well, when was the last time all 50 states agreed on anything?

My Reston, Va., listening post has picked up increasing concerns about FirstNet’s decision to adopt a centralized, controlled network. As any informed observer can imagine, each state’s emergency communications capabilities are based on separate networks and use vastly different technologies—although many still rely on Land Mobile Radios and, according to insider reports, have no plans to completely ditch their own autonomous LMR networks. And that’s a problem for FirstNet.

FirstNet has taken the position that state autonomy in network design decisions and management will jeopardize FirstNet’s ability to provide a network that meets its coverage and service goals.

“The governance model chosen by FirstNet is a federalized, centrally planned and directed network, bolstered by federal procurement practices that limit states to a consultative role,” according to a congressional analysis intercepted by The Situation Report. “A risk in choosing this model is that states may consider the federal presence excessive and cease to cooperate with FirstNet, jeopardizing the purpose of the network.”

But May is the month of proposals for FirstNet. Potential bidders will first send FirstNet officials capability statements, followed by full proposals to build the network.

However, my mobile listening post outside of 445 12th Street SW, Washington, D.C., has picked up strong signals that the FCC is working hard on opt-out plans.

“As I have suggested to the board a number of times with all due respect, FirstNet isn’t the foremost thing on the minds of many governors when dealing with fiscal crises and natural disasters and things like that,” said former Vermont governor James Douglas during the last FirstNet board meeting in March.

VA’s New CISO?

My listening post on Vermont Avenue in Washington, D.C., tuned into the rumor mill this week and discovered some interesting, yet unconfirmed, reports of a new deputy assistant secretary for cybersecurity at VA.

Talk at VA is that Dominic Cussatt, the deputy director for cybersecurity policy in Terry Halvorsen’s office at the Defense Department, is packing his bags to take the cybersecurity gig at VA. Cussatt would replace Ron Thompson, who’s been serving as interim CISO since Veterans Affairs CIO LaVerne Council ordered Brian Burns to “redirect his exclusive focus on VA’s role in the Interagency Program Office (IPO).”

Truth or Dare?

My remote sensing system on Capitol Hill has picked up chatter that some “analysts” on the Hill are daring lawmakers to do away with traditional polygraph examinations for security clearance holders in favor of the intelligence community’s new Continuous Evaluation program, which relies on constant monitoring of social media and other forms of big data analytics.

If Congress—which isn’t subject to a polygraph examination—does consider eliminating the polygraph, the result could be new business for the IT industry.

“What emerging technologies show the most promise in providing more objective measures to detect lying?” asked the congressional researcher. “Should additional resources be directed toward encouraging such technologies?”

That said, the result could also be a boom in business for foreign intelligence services.

Intercept a situation report? Email me at dverton@meritalk.com or DM via Twitter

The Great Cloud Debate: Public vs. Private – and the ViON Hybrid Model

Today, just 13 percent of Feds say they can deploy new systems as quickly as required. And it’s no secret agencies are being pushed to make the move to cloud – whether the driver is data center consolidation, the need for flexible performance to meet constituents’ on-demand requests, a simple desire to increase overall IT efficiency, or the demands of the current regulatory environment.

So where do agencies start as they evaluate their cloud options?  Enter the Great Cloud Debate.

For most, choosing the right cloud is more than a black-and-white decision.  In fact, despite cloud growth goals and Fed initiatives, cloud adoption currently accounts for only about 4 percent of Federal workloads.  Concerns about contract lock-in, application readiness, data security, data custody, and legacy buying processes are commonly cited as barriers to cloud adoption.

Before making a cloud decision, agencies need to start with the basics – and understand why they want or need cloud. From accelerating the move to an as-a-Service (aaS) consumption model to maximizing budgets to achieving faster time-to-value, agencies need to define their critical mission success factors and frame their move to cloud in that context.

When it comes down to public vs. private cloud, direct comparison is nearly impossible. The choice boils down to which cloud is the right answer for the agency’s unique mission, challenges, and workload requirements.  Issues to consider include performance requirements, availability, security, data custody and control, access patterns, support requirements, available internal skill-sets, speed of deployment requirements, and scalability.

Private cloud is often the best cloud option for Federal agencies. Seventy-five percent of Feds want to migrate more services to cloud, but are concerned about retaining control over their data. Further, agencies estimate 32 percent of their data can’t be moved to cloud due to security or data sovereignty issues.

Private cloud offers security, accelerated performance, availability – and importantly, but often overlooked – can be a means to eliminate re-architecting applications to fit into a cloud business model.  Agencies also retain full control of their data. Public cloud promises accelerated deployment, maximum scalability, and simplified acquisition.

That said, why not have the best of both worlds?

Hybrid cloud models address agencies’ main concerns about data security, availability, and control, while still enabling high performance and flexibility. Placing data in a private, hosted environment and connecting just the compute side to a public cloud ensures the data is secure – putting agencies’ nerves at ease. And the agency retains control of the data and never, for example, needs to pay a fee to remove it from the cloud environment.
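The data-placement rule that the hybrid model above implies can be sketched in a few lines. This is only an illustration of the general idea; the tier names, workload attributes, and placement logic are hypothetical and not drawn from any vendor’s actual product.

```python
# Illustrative sketch of a hybrid-cloud placement rule: sensitive data
# stays in a private, hosted tier, while stateless compute can burst to
# public cloud. All names and rules here are hypothetical.

def place_workload(component: str, holds_agency_data: bool,
                   needs_burst_scaling: bool) -> str:
    """Return the cloud tier for one workload component."""
    if holds_agency_data:
        # Data custody: the agency keeps its data in the private tier,
        # so it never owes an egress fee to retrieve it.
        return "private-hosted"
    if needs_burst_scaling:
        # Stateless compute can take advantage of public-cloud elasticity.
        return "public"
    return "private-hosted"

if __name__ == "__main__":
    workloads = [
        ("records-db", True, False),
        ("web-frontend", False, True),
        ("batch-report", False, False),
    ]
    for name, has_data, burst in workloads:
        print(f"{name}: {place_workload(name, has_data, burst)}")
```

The point of the split is visible in the rule itself: anything that holds agency data never leaves the private tier, so the public cloud only ever sees compute.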

ViON’s hybrid cloud model enables agencies to experience the benefits of public and private cloud, addressing key concerns of agencies as they approach the cloud procurement process – security, availability, scalability, and cost.

Build Your Cloud, Your Way.

We give agencies a system with a floor and ceiling customized by their needs and budget.

IT teams can select best-of-breed infrastructure elements, customized to deliver the availability and performance required – vs. “take it or leave it” public cloud configurations.

And, they can add capacity – scaling up or down as required to meet constituents’ on-demand needs, only pay for what they use, and retain full control over their data.  That’s the real silver lining.

This blog post was originally published here.


Learn More:
Business of Cloud eBook
ViON Agile Cloud Solution Portfolio

Look Who’s MeriTalking: Virginia CIO Nelson P. Moe

The Commonwealth of Virginia has made important investments in IT and cybersecurity in its fiscal 2017 budget. Those investments will support a doubling of the staff supporting Virginia’s chief information security officer, the building of a cybersecurity range, as well as the establishment of a cybersecurity fusion center and a cybersecurity services bureau.

“The beauty of Virginia is that it’s a consolidated state–all the agencies come through us,” said Virginia Chief Information Officer Nelson P. Moe. “So from a cyber, governance, projects, and procurement, we get a chance to have a common agenda focus.”

The Weekend Reader – April 29

Sens. Ron Johnson, Tom Carper Ask OMB’s Shaun Donovan for Federal Data Security Guidance Revision

executivegov.com
Under the appendix, Federal agencies are required to subject security controls for major applications and support systems to audits at least every three years. “While some documentation of security controls is essential, these three-year assessments are not cost-effective or consistent with best practices or other Federal policies,” the lawmakers said. Carper and Johnson asked OMB to submit its response to the Senate committee within 30 days.


U.S. CIO Hints Federal Adoption of ‘Bimodal IT’ to Balance Old and New Tech

scmagazine.com
U.S. Chief Information Officer Tony Scott on Tuesday hinted his office may be working to help guide Federal agencies to adopt “bimodal IT” to balance modern IT with old but necessary systems.
Gartner defines the practice as “managing of two separate, coherent modes of IT delivery, one focused on stability and the other on agility.”


New Federal Cybersecurity Rules Moving Too Slow, Senators Say

ciodive.com
The senators say the lack of a new policy is preventing Federal agencies from moving to automated systems that can better protect Federal networks from cybersecurity threats. The existing Federal cybersecurity policy was created in 2000 and the threat landscape has evolved significantly since then.


IT Execs Join Federal Cybersecurity Panel

ecommercetimes.com
One of the most recent developments was the formation of a Federal Commission on Enhancing National Cybersecurity. Another was the formal introduction in Congress of the administration’s information technology investment plan, which is heavily tilted toward cybersecurity protection. The goal of the panel is to recommend actions that can be taken over the next decade to enhance cybersecurity awareness and protections throughout government and the private sector, according to a White House statement.

Facebook Transparency Report Shows Increase in Government Data Requests, Most With Gagging Orders

betanews.com
Facebook has published its latest Global Government Requests Report covering the second half of 2015. The transparency report reveals that there has been a 13 percent increase in the number of government requests for data, but it also shows that Facebook is still not able to be as transparent as it might want. For the first time the social network is able to report about the number of data requests that have a non-disclosure order attached to them.


The Situation Report: NIST Framework Mandatory? Open Source Rebellion at DHS?

The New De Facto Voluntary Standard

My Gaithersburg, Md., listening post has picked up strong signals that there’s a new mandatory cybersecurity standard in town. The Framework for Improving Critical Infrastructure Cybersecurity, developed in 2014 by the National Institute of Standards and Technology in close cooperation with industry, has always been a voluntary guide for organizations of all types and sizes to apply common best practices in risk management.

But a low-key change has taken place that sources say has shifted the NIST CSF from a purely voluntary practice to a mandatory standard for Federal agencies. For the first time, the government has linked the Federal Information Security Modernization Act metrics to the CSF. In fact, the fiscal 2016 FISMA metrics leverage the NIST CSF as a standard for managing and reducing risk, and are organized around the CSF’s five major functions of identify, protect, detect, respond, and recover.

“Since they tied the FISMA reporting metrics in 2016 to the Cybersecurity Framework, guess what? It’s now a de facto standard,” a source close to the development of the CSF since the beginning told The Situation Report. “You have to use it. It’s voluntary, but there’s no way of getting around it and still being compliant with FISMA.”
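The tie between the FISMA reporting metrics and the Framework described above is just a grouping: each reported metric falls under one of the CSF’s five functions. The function names below come from the CSF itself; the sample metrics grouped under them are hypothetical placeholders, not the actual FY2016 FISMA questionnaire items.

```python
# The five function names are from the NIST Cybersecurity Framework.
# The sample metrics are hypothetical illustrations only.

CSF_FUNCTIONS = ["identify", "protect", "detect", "respond", "recover"]

sample_metrics = {
    "identify": ["hardware asset inventory coverage"],
    "protect":  ["percent of privileged users with PIV-enforced login"],
    "detect":   ["percent of traffic monitored for intrusions"],
    "respond":  ["median time to contain an incident"],
    "recover":  ["percent of systems with tested contingency plans"],
}

def coverage_report(metrics: dict) -> dict:
    """Count how many reported metrics fall under each CSF function."""
    return {fn: len(metrics.get(fn, [])) for fn in CSF_FUNCTIONS}

print(coverage_report(sample_metrics))
```

An agency with zero metrics under any one function would show a visible gap in such a report, which is exactly how organizing FISMA reporting around the CSF makes the Framework unavoidable in practice.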

What’s In a Name?

Well, if you’re the focal point of the Federal government’s effort to get all of government and the private sector speaking the same language when it comes to cybersecurity, then names matter a lot. Officials at NIST are seriously considering dropping “Critical Infrastructure” from the title of the Framework for Improving Critical Infrastructure Cybersecurity in an effort to boost adoption across a broader swath of industry. The framework is being credited with significantly helping to raise the bar in security across industries, but some in the private sector apparently have questioned whether a critical infrastructure guide applies to them.

The Open Source Battle at DHS

The Department of Homeland Security’s chief information officer, Luke McCormack, was put in a tough position recently when he had to publicly flip-flop on the department’s official position on the use of open source software.

McCormack was forced to post to GitHub a strong formal endorsement of a draft White House policy for publishing Federal source code in the open. “We believe moving towards Government-wide reuse of custom-developed code and releasing Federally-funded custom code as open source software has significant financial, technical, and cybersecurity benefits and will better enable DHS to meet our mission of securing the nation from the many threats we face,” McCormack wrote, reversing the concerns expressed a week earlier by members of his own team.

Those DHS IT officials had called out the misguided geeks at the White House, noting that most security companies do not publish their source code because doing so would allow hackers to develop highly targeted attacks.

“Government-specific examples: citizenship anti-fraud rules that are coded into software, identification of special codes used to flag law enforcement actions, APT threat indicator scripts, Mafia having a copy of all FBI system code, terrorist with access to air traffic control software, etc. How will this be prevented?” a DHS IT official stated.

And what about protecting taxpayers’ interest in government-developed software? That’s right, some at DHS would like to know how the White House will prevent commercial entities from using taxpayer-funded software components in commercial systems that companies then sell back to the government.

McCormack may have caved to White House pressure, but The Situation Report has picked up on a rear guard action to stop the White House open source push–an effort that puts the nation’s security at risk through a deliberate decision to ignore the security issues surrounding software provenance and the threat of inheriting version-specific vulnerabilities in open source code.

“Given that national security systems are exempted from this policy, and virtually all DHS systems are deemed mission/business essential, any release of code is potentially exploitable,” DHS IT officials wrote. “To avoid having our in-house developed code becoming open source, we will have to either get the DHS CIO approval to the exceptions or declare our in-house developed systems to be National Security Systems and take them off the Sensitive But Unclassified (SBU) blue line and put them on classified networks, thus increasing our costs of operation and support.”

Digital Service Rebellion

My forward observers report signs of a massive rebellion against the U.S. Digital Service by the career Federal IT employees who are being blacklisted, maligned, and generally pushed aside for not being “from the Valley.” MeriTalk plans to bring you an exclusive look at this insurgency—penned by a current Federal IT insider who believes the Obama administration has gone too far in its attempt to import change from Silicon Valley. Stay tuned.

Why Are We Letting Our IT Infrastructure Fall to Pieces?

Why are we letting our IT infrastructure fall to pieces?

Former congressman and Secretary of Transportation Ray LaHood recently asked this question (without the insertion of “IT”) about the nation’s aging infrastructure.

Alan P. Balutis (Photo: MeriTalk)

LaHood’s article focused on an oft-discussed topic–the nation’s crumbling physical infrastructure. He notes that in the past 12 months, broken dams in South Carolina caused flooding and fatalities, a massive gas leak in Los Angeles sickened and displaced thousands of families, and residents of Flint, Mich., found unsafe lead levels in their drinking water. In Washington, D.C., the region’s Metrorail system might be facing line closures to make long-neglected safety repairs.

Our nation’s refusal to perform critical maintenance, to invest in our public infrastructure, and to take care of our roads, rails, bridges, and pipelines has been widely discussed and the costs well documented. As Rosabeth Moss Kanter, a Harvard business professor and the author of “Move,” a recent book on the subject, said in The New Yorker recently, “Infrastructure is such a dull word. But it’s really an issue that touches almost everything.”

Only recently, though, has a similar situation–the government’s reliance on outdated technology–surfaced as an issue. Federal Chief Information Officer Tony Scott has called it a “crisis” to rival the Y2K computer glitch.

Dave Powner, Director of IT Management Issues at the Government Accountability Office (GAO), noted that some agencies are running tens of millions of lines of outdated software code, written in languages such as COBOL and assembly. Less frequently mentioned is the aging infrastructure itself–switches, routers, servers, desktops, mainframes, etc. Recent research has suggested that a substantial portion of the government’s IT hardware has already reached Last Day of Support (LDoS), which means it no longer receives updates, security alerts, or patches. In the next two years, an even greater portion of that infrastructure will reach the same stage.
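The LDoS exposure described in that research is straightforward to estimate from an asset inventory: compare each asset’s last-day-of-support date against today. Here is a minimal sketch; the inventory records, asset names, and dates are entirely hypothetical.

```python
from datetime import date

# Hypothetical asset inventory: (asset name, last-day-of-support date).
# A real agency inventory with vendor-published LDoS dates would
# replace these made-up entries.
inventory = [
    ("core-router-01", date(2014, 6, 30)),
    ("mainframe-a", date(2026, 12, 31)),
    ("file-server-03", date(2015, 1, 15)),
    ("desktop-fleet-7", date(2018, 3, 1)),
]

def ldos_share(assets, today):
    """Fraction of assets whose last day of support has already passed."""
    expired = sum(1 for _, ldos in assets if ldos < today)
    return expired / len(assets)

print(f"{ldos_share(inventory, date(2016, 5, 6)):.0%} of assets past LDoS")
```

Tracking that fraction over time is also how the article’s second claim could be checked: re-running the same calculation with a date two years out shows how much more of the fleet ages past support.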

Along with increased security risks and vulnerability to cyberattacks, these outmoded systems can’t support growing demands for greater mobility, collaboration, data analytics, etc. Finally, they are also at higher risk of simply breaking down. Consider what a catastrophic blow that could be to the business of government–tax collection, benefit payments for veterans, monthly checks for Social Security recipients, air traffic control, and so on. Recent reports note that the Coast Guard is “overwhelmed” by the daunting task of updating its legacy IT infrastructure. A major failure could be a matter of when, not if, GAO’s Powner says.

While we are at it, we should also recognize the need to modernize the processes by which government buys and operates its IT infrastructure, which is a major part of the reason why it’s been so hard to modernize. But more about that in another column.

Former White House chief of staff and now Chicago Mayor Rahm Emanuel a few years ago pronounced a rule that now bears his name: “Never let a serious crisis go to waste.”

To quote Kanter once again and note its applicability to the government’s aging IT infrastructure: “This is the heart of our problem: infrastructure policy has become a matter of lurching from crisis to crisis, solving problems after the fact rather than preventing them from happening.  We’ve turned into short-term-fix addicts.”

The president’s legislative proposal to establish an Information Technology Modernization Fund would support the transition to a more secure, efficient, and modern IT infrastructure. It deserves support from all of us.


Alan P. Balutis is Senior Director and Distinguished Fellow, U.S. Public Sector, at Cisco Systems.

The Weekend Reader – April 22

Deltek: 2017 Federal Budget Presents Opportunity for Cyber, R&D, Health IT Vendors

executivebiz.com
Deltek estimates contractor-addressable spending on the U.S. government’s mission-critical programs will increase by $18 billion to about $682 billion in fiscal year 2017 if Congress approves the White House’s latest budget request. The report forecasts continued growth in the Federal cybersecurity, big data analytics, health care information technology and infrastructure segments despite a projected small decline in overall contractor-addressable IT spending for FY 2017. “Government demand looks particularly strong for…areas that align with the Obama administration’s focus on modernization, health care and veterans services,” said Deniece Peterson, Deltek’s director of Federal industry analysis.

F.B.I. Director Suggests Bill for iPhone Hacking Was $1.3 Million

nytimes.com
The director of the FBI suggested Thursday that his agency paid at least $1.3 million to an undisclosed group to help hack into the encrypted iPhone used by an attacker in the mass shooting in San Bernardino, Calif.
At a technology conference in London, a moderator asked James B. Comey Jr., the FBI chief, how much bureau officials had to pay the undisclosed outside group to demonstrate how to bypass the phone’s encryption.


Apple’s Transparency Report Reveals Compliance with Government Data Requests

govtech.com
Apple says these requests typically seek information about a user’s iTunes or iCloud account, and each requires a search warrant. That information could then be used to help investigators prevent planned crimes from taking place or, after the fact, assemble a criminal case against someone. Privacy advocates are alarmed by the growing number of these personal-data requests.


Why Cyber Is a Boardroom Issue

bgov.com
Cybersecurity is no longer the exclusive domain of corporate IT shops. In the past and in some quarters today, cybersecurity is still viewed as “some IT thing.” But the companies that take this view do so at their own peril. The specter of data breaches and denial-of-service attacks is a risk facing every business with an Internet connection.

 

 

Pentagon CIO: U.K. a Model on Cloud Adoption

fcw.com – Microsoft announced last November that the company would begin offering cloud services from the United Kingdom, with the firm saying those services would extend to government organizations. Department of Defense CIO Terry Halvorsen has evangelized for the Pentagon to be more willing to allow cloud vendors to host sensitive DOD data. He would like about 50 DOD personnel to do a stint in the private sector in the coming year, and likewise bring about 50 IT hands from industry to the Pentagon.

 

The Situation Report: VA’s Never-Ending IT Shuffle, and a Bad Start for InfoSec Week

VA CISO Watch

The Situation Report has learned that Department of Veterans Affairs CIO LaVerne Council has ordered VA CISO Brian Burns to “redirect his exclusive focus on VA’s role in the Interagency Program Office (IPO).”

“To meet our goal, we must have a dedicated, focused leader for interoperability,” Council wrote Wednesday in an email to staff obtained by The Situation Report. The agency certified interoperability with the Defense Department on April 8 in accordance with the requirements spelled out in the 2014 National Defense Authorization Act. “Brian’s prior work in the IPO combined with his extensive experience in clinical and health technology reaffirm that he can provide that focus and help guide our efforts beyond the certification, beyond VistA 4, and provide a framework for Veterans today and in the future.”

Council has also tapped Ron Thompson, the former executive director of IT infrastructure and operations for the Department of Health and Human Services who late last year became Council’s Principal Deputy Assistant Secretary, to serve as interim VA CISO.

“To ensure continuity in our information security program, Ron will serve as the interim Chief Information Security Officer (CISO), giving us the opportunity to renew our search for a permanent, long-term CISO,” Council wrote. “The tenet of fully resourcing our cybersecurity efforts must be consistent–our Office of Information Security must have a singularly focused leader.”

Off to a Bad Start

VA kicked off its 2016 Information Security and Privacy Awareness Week (ISPAW) Speaker Series on Monday, but a stellar event it was not. Multiple human sources debriefed the Situation Report on the event, which took place via online chat and telephone dial-in. The most glaring problem with what should be an important initiative for an agency constantly dogged by security lapses was the absence of LaVerne Council. Although scheduled to deliver the keynote, Council canceled her appearance at the last minute for unknown reasons. Tina Burnette, executive director of the Field Security Service, filled in for Council.

The theme for the week, according to Burnette, is enterprise cyber strategy.

The Situation Report analyzed multiple accounts of the call and discovered that only about 100 VA employees joined the session. Only four VA employees were brave enough to ask questions, even though much of the agency’s information security leadership was available to answer them. One question, however, was particularly instructive: “Where does the process of information security start?” a VA employee asked.

A speaker identified as Randy Ledsome (unconfirmed), VA’s director of Field Security Service, tried to answer the question, but somebody had put their call on hold and the hold music temporarily interrupted the call. Once that was cleared up, Jackson made an attempt at an answer. “I think this gentleman had a very complex question,” Jackson said. “It starts with having a program. One of the things we’ve done for the [Information Security Officers] we’ve put together what we call the ISO Reference Guide, and one of the things we laid out in there was a problematic—a programmatic—approach to dealing with our programs.”

The question-and-answer portion of the call went on for another 30 minutes, ending with a long, awkward interruption by a Spanish speaker who did not have his phone on mute.

Look Who’s MeriTalking: USDA’s Flip Anderson

Flip Anderson is the Executive Director of FITARA Operations at the U.S. Department of Agriculture.

MeriTalk caught up with Anderson at this year’s FITARA Forum and Data Center Exchange Brainstorm in Washington, D.C., on March 30, for this special video edition of Look Who’s MeriTalking.

Watch Now

The Weekend Reader – April 15

Federal Data Should Be Open To Public, Lawmakers Say

lawyerherald.com – Data collected by the Federal government should be open and accessible to the public by default, according to a group of lawmakers. Speaking at a discussion in Washington on Thursday, Sen. Brian Schatz, D-Hawaii, said agencies today release data as virtual documents that force the searcher of the information to do their own digging. Among the Federal agencies the bill would cover is the Education Department.

 

Microsoft Sues the U.S. Federal Government Over the Right to Reveal Data Requests

windowscentral.com – Microsoft wants to reveal more information on the data requests it gets from the U.S. Federal government. The company filed a lawsuit claiming the government has violated the First and Fourth Amendments by ordering Microsoft to keep thousands of data requests to the company secret. Notably and even surprisingly, 1,752 of these secrecy orders, or 68 percent of the total, contained no fixed end date at all.

 

 

‘Cloud-First’ To Close 5,000 Federal Data Centers By 2019

informationweek.com – In 2010, the Obama administration’s first Federal CIO, Vivek Kundra, mandated that Federal agencies pursue a “cloud-first” strategy instead of building more data centers. Since then, 3,125 of the 10,584 Federal agency data centers that existed when Kundra made the announcement have been closed.

 

 

 

Burr-Feinstein Encryption Bill is Officially Here in All Its Scary Glory

techcrunch.com – Sens. Richard Burr and Dianne Feinstein released the official version of their anti-encryption bill after a draft appeared online last week. The bill, titled the Compliance with Court Orders Act of 2016, would require tech firms to decrypt customers’ data at a court’s request. The Burr-Feinstein proposal has already faced heavy criticism from the tech and legislative communities and is not expected to get anywhere in the Senate. President Obama has also indicated that he will not support the bill.

 

Tony Scott: White House Proposes Federal IT Modernization Fund Bill

executivegov.com – The White House has proposed a bill that would create a $3.1 billion revolving fund to help Federal agencies update their legacy information technology systems and bolster the government’s cybersecurity posture. Federal CIO Tony Scott added that the bill would also establish an independent board of experts to help identify the agency IT systems that face the highest risk for potential cyberattacks, as well as strategies to facilitate adoption of common platforms and cybersecurity best practices across the government.

 

Editorial: VA’s Scheduling System Betrayal

Four months after a waiting-list scandal forced Secretary of Veterans Affairs Eric Shinseki to resign, I asked the new VA Secretary, Robert McDonald, whether he thought the VA had enough money in the budget to procure a new commercial scheduling system to ensure that veterans could get the care they needed, when they needed it.

“We tried to put the money we needed in the act that was recently passed. I can’t predict the future,” McDonald said in answer to my question during a press briefing at VA’s headquarters in Washington, D.C. “But I think we’ve done a good job of that. As we work through this scheduling system, we’re going to be very eager to find an off-the-shelf product that is proven effective. The off-the-shelf product will become very important as we move forward. I thought it was a brilliant piece of work by [Deputy Secretary Sloan Gibson] and the team to come forward and say, ‘we are going to take an off-the-shelf product.’ ”

Eighteen months and $11.8 million later, VA’s resolve is wavering. VA chief information officer LaVerne Council and Veterans Health Administration Under Secretary for Health, David Shulkin, told Congress Thursday that they are now unsure how or if they will proceed with the $624 million Medical Appointment Scheduling System (MASS) contract awarded last year.

“We want to be certain that continuous modernization of a 40-year-old electronic medical record is an appropriate decision,” Shulkin said.

Instead of moving out aggressively on a commercial system with a proven track record, Shulkin and Council did the unthinkable: They decided to develop their own in-house upgrade to the scheduling module of VA’s main electronic health system, known as VistA. That’s right—even after a major scandal involving deliberate manipulation of the scheduling system that led to the deaths of veterans, VA thought it was appropriate to tackle the development themselves.

VA is testing an intermediate VistA Scheduling Enhancement tool at two medical facilities. If the employees like it, VA will continue with the deployment.

As a veteran, I find such inept decision-making infuriating. But what makes the situation worse is that these decisions are being made because of money: the money that McDonald told me in 2014 was already in the budget for a new commercial system.

“The entire VSE rollout will cost taxpayers $6.4 million. If we roll out MASS, which is an absolute option for us, the pilot alone will be $152 million,” Shulkin said, explaining to lawmakers how VA was struggling to balance the needs of veterans with the agency’s duty to be good stewards of taxpayer dollars. “It will take us 10 months to roll it out in three sites, and that’s if VA stays on schedule with its pilots,” he said. “We have not ruled out MASS. I want to be absolutely clear about that.”

Shulkin and Council may not have ruled out a commercial replacement system through MASS, but that doesn’t explain what he said next about VA’s plans for the VistA Evolution pilot testing that is underway.

“I have the user evaluations, which are tremendous,” Shulkin said. “It’s planned to roll out to 11 VA [facilities] by the end of this month or the next six weeks, and then a national rollout.”

To his credit, Shulkin got one thing right when he acknowledged “we still have an access crisis” at the VA. Yes, something has to be done to improve the situation in the near term, and VSE is an important part of that. But to consider not moving forward with MASS—a comprehensive commercial health management system awarded under a competitive bidding process—is tantamount to reneging on the promise made to veterans that the scheduling and access problems will be fixed.

McDonald brought Shulkin and Council (who worked for McDonald at Johnson & Johnson) to the VA to inject new thinking into the bureaucracy. But they have failed. The VA is a bureaucracy with a culture beyond repair. The wounds of the scheduling scandal are not even close to being healed and VA’s leaders have handed the task of modernizing that system right back to the people who created it.

“This seems like déjà vu all over again to me,” said Rep. Ann McLane Kuster, D-N.H. “VA has already wasted nine years, $127 million without an update to its scheduling system, after finding a commercial product and abandoning that for an in-house solution that could not deliver an adequate update. We cannot and will not let this happen again.”

Senile Systems?

What was the first computer? George Stibitz’s Model K, Hewlett and Packard’s 200A, Tommy Flowers’ Colossus? None of the above. Believe it or not, the ancient Greeks beat the Geeks to the punch by more than 2,000 years. Dating back to roughly 205 B.C., the Antikythera Mechanism is an ancient analog device that computed the orbits of the planets and the timing of the Olympics.

Hieroglyphics:

So, as the House Oversight and Government Reform Committee conducts its archaeological spadework to unearth ancient Federal IT systems, consider: things could be worse. OGR asked the 24 Cabinet-level agencies for an audit of their legacy systems and migration plans by January 29. Capitol Hill tells us many agencies missed the deadline, but most of the 24 have now submitted their reports. Dave Powner, GAO’s IT lead, offered some interesting insight at the MeriTalk FITARA Forum and Data Center Brainstorm. Based on agency reports, agencies are running more than 10 million lines of COBOL and Assembler code – and these hieroglyphics power many mission-critical functions. Powner pointed to the two-year average Federal CIO tenure as the enemy of real change: IT execs look for quick wins, which means they avoid entering the mummy’s tomb. Powner quipped that these ancient systems are either a huge cyber liability or as safe as houses – who’s writing viruses for this stuff?

 

Back away from the details and this is clearly a call for governmentwide leadership. Keep an eye out for the May/June OGR hearing on Uncle Sam’s geriatric IT. We’ll doubtless see some shocking examples – but to be clear, this is not about beating up on agencies; it’s about incentives and changing the failing status quo. That said, everybody’s curious to see the details in the agency reports. Here’s hoping OGR makes the data public.


The Situation Report: Removing the Intelligence Community CIO’s Extra Hat

Photo: Director of National Intelligence James Clapper. (Credit: INSA)

Two Hats Are Not Always Better Than One

MeriTalk recently broke the news that the Office of the Director of National Intelligence is planning to hire its first chief information officer. Sounds pretty straightforward, but my Langley, Va., listening post has picked up strong signals that there is much more to the story and the timing of this new job search at ODNI.

Keen intelligence community observers will know that the ODNI traces its roots to the Intelligence Reform and Terrorism Prevention Act of 2004. But only the most sophisticated observers will recall the amendment that the first intelligence community CIO, Dale Meyerrose, succeeded in getting into the 2005 Intelligence Authorization Act. That bill, which became law on Dec. 23, 2004, included the following section:

‘(d) PROHIBITION ON SIMULTANEOUS SERVICE AS OTHER CHIEF INFORMATION OFFICER- An individual serving in the position of Chief Information Officer may not, while so serving, serve as the chief information officer of any other department or agency, or component thereof, of the United States Government.’

What does that mean? Well, we asked a few of our data scientists to run this through our decryption tools and it sounds like the ODNI—the center of gravity for intelligence policy in the post-9/11 era—has been dual-hatting the intelligence community CIO for the last decade.

“It may have been a benign oversight. Somebody in the general counsel’s office finally noticed,” said one Langley insider. It also means that it’s probably time to update Intelligence Community Directive 500, which sets forth the authorities for the IC CIO. Former DNI Mike McConnell signed Directive 500 in 2008. The job posting for the new ODNI CIO position, however, remains extremely vague in terms of how the two CIOs are to coordinate their responsibilities—a critical step to ensuring the survival of the Intelligence Community Information Technology Enterprise initiative, known as ICITE.

Protecting ICITE

There’s no question that the ODNI has gone through some growing pains, especially since it has faced an uphill battle against an army of doubters who were often very vocal in their opposition to the need for such an office. But Director of National Intelligence James Clapper has done an amazing job of elevating one of the most pressing issues facing the IC—establishing a common computing and information sharing environment that will enable rapid collaboration and decision-making.

“There is a desperate effort underway to ensure that ICITE survives the election and presidential transition,” our Langley source acknowledged. “ICITE has become the singular technological artifact of the DNI, yet there is no enterprise architecture, there is only ICITE,” our source said. “They’re hoping that by January 2017 ICITE and enterprise architecture are one and the same thing. So anything that works toward that goal is critical. Making sure the ODNI is a going concern as a hub entity with an empowered CIO is part of that. They’re going to try to fix that quickly.”

Quickly indeed. The ODNI CIO job announcement closes April 15 and plans call for a new CIO to be on board within 30 days.

Send your Situation Reports in confidence to dverton@meritalk.com

Look Who’s MeriTalking: Dell Federal’s Jeff Hogarth

Jeff Hogarth is the senior sales director for Dell’s Federal Civilian Business Unit.

MeriTalk caught up with Hogarth at this year’s FITARA Forum and Data Center Exchange Brainstorm in Washington, D.C., on March 30, for this special video edition of Look Who’s MeriTalking.

Watch Now

The Weekend Reader – April 8

Overnight Cybersecurity: Obama to Review Encryption Bill

thehill.com – President Obama’s briefing means the bill will not be released this week, as Sen. Richard Burr hoped. Meanwhile, the White House on Thursday denied reports that it will not offer its support to the bill.

 

 

 

Will Federal Data Center Construction Freeze Benefit Colocation Providers?

datacenterknowledge.com – Will the latest White House freeze on data center expansion and construction by Federal agencies accelerate colocation and cloud deployments? In February 2011, the “Cloud First” initiative required Federal agencies to evaluate their technology sourcing strategies so that cloud computing options were fully considered. It stressed the importance of each Federal agency migrating the majority of its data to cloud-based servers by 2015.

 

Government Primes Federal Cybersecurity Reporting Rules for Insurers

healthcaredive.com – The Federal government’s Office of Personnel Management has announced plans to introduce new data breach reporting rules for health insurers that cover Federal employees, according to a Nextgov report. Director Beth Cobert argued that, given the breaches at OPM and other insurers and providers, the government and its partners must coordinate efforts to keep their data secure. The rules echo draft guidelines issued by the White House last August, Nextgov notes, that aim to standardize cybersecurity incident reporting among contractors that store Federal data on third-party systems.

MIT Open Data Portal: A One-Stop Shop for Federal Government Data?

govtech.com – Combing through Federal data has typically been a daunting affair. Data USA, a site that aggregates Federal open data from multiple sources, displays it in interactive visuals: colorful charts, maps, profiles, and even a few pieces of data-based journalism. Unlike scores of citizen analytics sites before it, Data USA embraces the role of data curator and, with minimal nudges, guides its visitors to create actionable data insights.

 

Vast Majority of Federal IT Professionals Feel Their Agencies are at Risk

cybersecuritybusiness.com – Ninety percent of IT professionals in the Federal government feel their organizations are vulnerable to a cybersecurity attack, according to a recent report by Vormetric. The numbers are disconcertingly high considering they come from professionals tasked with protecting the confidential information of millions of Americans as well as the classified information from certain Federal programs and policies. Despite those high numbers, nearly 60 percent of responding government IT professionals believe their network defenses are “very” effective at safeguarding data, a number the report notes is notably more optimistic than their private-sector counterparts; the U.S. average is 53 percent.

 

The Situation Report: Federated IDs For Health IT and Insider Threat Update

Talent Crunch

Just how tough has it become for Federal agencies to find skilled technical talent? It’s become so tough, in fact, that the National Security Agency is collecting resumes from “former civilian affiliates” who have the necessary skills, experience, and security clearance to help the agency “augment the existing work force on high priority projects or programs.”

Federated Identity Pilot

My Gaithersburg, Md., listening post has picked up signals coming from the National Strategy for Trusted Identities in Cyberspace that a new pilot program is underway to demonstrate the use of federated online identity technologies for use by hospitals and patients.

“Currently, patients and providers need to obtain new identity credentials to access health information at different organizations,” according to the Federal funding opportunity announcement intercepted by the Situation Report.

“Technologies exist to streamline authentication to web portals today but mostly exist to service one organization only. The goal for this project is for hospital systems to work with other regional health systems and provider groups to operationalize the acceptance of federated identity and operate the pilot for at least six months. The use cases for this pilot would involve a federated credential solution that allows patients and health care providers to use the same credential with at least two healthcare organizations.”

Applicants must apply via Grants.gov by June 1. NIST said it expects to start the $750,000 to $1 million pilot project by Oct. 1.

Insider Threat Updates

My E-Ring listening post at the Pentagon reports that the Defense Department continues to make significant progress on its insider threat detection program and the intelligence community’s new continuous evaluation (CE) effort. Reports indicate that DOD has expanded its CE program to 225,000 employees, and is on track to reach 500,000 personnel this year.

But the overall Federal effort to establish Insider Threat Programs at the agency level by December 2016 is at risk, according to the latest cross-agency priority goal quarterly update, obtained by the Situation Report. The National Insider Threat Task Force has missed all three key goal dates for establishing programs at the agency level.

“Most of the executive branch departments and agencies have accomplished program establishment tasks. Many departments and agencies are discovering challenges with issues such as organizational culture, legal questions, and resource identification, to name a few,” the report states.

In addition, it’s still taking much longer than it should to obtain a security clearance. For example, the 82,186 secret-level security clearances initiated during the first quarter of 2016 took an average of 116 days—that’s 42 days longer than the intelligence community would like. The 17,100 initial top-secret clearance requests took an average of 203 days to complete—89 days longer than the goal set by officials.
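Working backward from those overruns gives the unstated processing goals; a quick arithmetic check (the 74- and 114-day goals are derived here, not stated in the quarterly update):

```python
# Back out the implied clearance-processing goals from the figures above.
secret_actual_days = 116      # average for the 82,186 secret-level clearances
secret_overrun_days = 42      # days beyond the intelligence community's goal
top_secret_actual_days = 203  # average for the 17,100 initial top-secret requests
top_secret_overrun_days = 89  # days beyond the stated goal

secret_goal = secret_actual_days - secret_overrun_days
top_secret_goal = top_secret_actual_days - top_secret_overrun_days

print(f"Implied secret-level goal: {secret_goal} days")      # 74
print(f"Implied top-secret goal: {top_secret_goal} days")    # 114
```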

security clearance timeline

Migrant Intelligence

The real Donald Trump (@realDonaldTrump) will be happy to learn that my Oval Office listening post has picked up strong signals that the National Security Council is taking steps to address the intelligence challenges related to screening migrants. Monte Hawkins, an intelligence officer and NSC staffer, has been tapped to be the senior adviser for migrant screening and vetting on the NSC staff.

L.A. Gets New Deputy CIO

Federal Cloud Procurement: What You Need To Ask

We all agree cloud consumption is inherently more efficient – helping agencies shift from CapEx to OpEx – and more flexible – enabling “anything as a service,” where agencies pay for what they use vs. what they project.

The plan is to use cloud to speed the Federal modernization path – a key goal considering just 32 percent of Federal IT managers anticipate their legacy applications will be able to meet mission needs in five years.

That said, Federal cloud transitions are moving slower than expected due to a series of challenges – with security, data governance, and procurement at the top of the list.

Fortunately, GSA is taking steps to address priority #1 – security – with plans to introduce a major reform effort to the FedRAMP program.

With cloud procurement, there are hundreds of Federal contracts and many factors to take into consideration.  So – what’s the best path? Upfront planning and staying cognizant of all steps involved in the cloud transition will provide agencies with an improved pathway to the cloud.

The first question:  What problems are you trying to address?  Are there mandates unique to your agency?  Geographic data storage requirements?  Do you plan to share the technology or dedicate it to a single organization?

Identify what you need and choose the right cloud model. Infrastructure-as-a-Service (IaaS) is ideal for agencies that need to directly maintain the infrastructure and data. Platform-as-a-Service (PaaS) is ideal for teams that will be developing and delivering new applications or want to accelerate deployment of new capabilities.
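The planning questions above can be sketched as a simple decision rule. This is a hypothetical illustration only; the function name and criteria are not an official framework:

```python
# Hypothetical sketch: map two of the planning questions above to a starting
# cloud service model. Criteria are illustrative, not an official framework.

def suggest_model(needs_direct_infrastructure_control, building_new_apps):
    """Suggest a starting service model from two planning questions."""
    if needs_direct_infrastructure_control:
        return "IaaS"  # agency maintains the infrastructure and data itself
    if building_new_apps:
        return "PaaS"  # platform accelerates delivery of new capabilities
    return "evaluate SaaS or a hybrid approach"

print(suggest_model(True, False))   # IaaS
print(suggest_model(False, True))   # PaaS
```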

Remember:  One cloud does not fit all. Agencies do not have to pick “one cloud” (i.e., public or private).  Agencies can adopt a hybrid cloud approach, allowing them to direct workloads as appropriate to commercial or private clouds.

FITARA means Federal CIOs need improved transparency and visibility into their IT environment, and a cloud move can be a key step.  ViON works with agencies to deliver a portal through which they can see their full environment, automate service provisioning, and access real-time consumption, funding, and billing data.

On the procurement side, ViON is a prime on key Federal contracts – GSA, NIH CIO, and NASA SEWP – simplifying the cloud procurement process.  We eliminate the need for RFIs and RFPs, as we’ve already completed vendor evaluations.  As a result, we can significantly simplify and expedite the procurement process.

We take on the upfront investment risk of hardware and software purchases, eliminating the burden on the agency and smoothing the cloud transition so agencies can lower cost, reduce risk, and easily scale to meet new mission requirements, quickly.  That’s the real silver lining.

Learn More:

Business of Cloud eBook

ViON Agile Cloud Solution Portfolio

 

This blog post was originally published here.

A Break in the Clouds? Taking a Business Model Approach to Clear a Path to Federal Cloud Adoption

Federal IT leaders have the best possible cloud intentions – from Cloud First and Shared First to FDCCI and FedRAMP. And, there is motivation. By most accounts, 80 percent of Federal IT dollars are currently spent on life support for legacy systems – an equation that needs to change.

This month GAO and OMB provided additional motivation.

GAO reported 22 of 24 agencies show a “lack of progress” in achieving OMB’s goal to close at least 40 percent of all non-core data centers by the end of 2015.

OMB released a new Data Center Optimization Initiative (DCOI) on March 3, superseding FDCCI and taking further steps in the push to transition to more efficient infrastructure, such as cloud services and inter-agency shared services. Beginning 180 days after issuance, the memorandum directs agencies not to budget funds or resources for a new data center, or to significantly expand an existing one, without approval from the OMB OFCIO.

Laying out the path forward, OMB says agencies will evaluate options for the consolidation and closure of existing data centers by (in order of priority):

  1. Transitioning to provisioned services, including configurable and flexible technology such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), to the furthest practicable extent, consistent with the Cloud First policy.
  2. Migrating to inter-agency shared services or co-located data centers.
  3. Migrating to better optimized data centers within the agency’s data center inventory.

Good intentions aside, the lack of progress in getting IT workloads into the cloud raises red flags about agencies’ ability to meet data center consolidation goals.

There are legitimate concerns – from cost uncertainty to CSP stability to data security/availability to migration issues.  The big question – how can agencies overcome the headwinds and accelerate the cloud transition?

At ViON, we take a straightforward “business model” approach, packaging infrastructure with a proprietary cloud financial model and 24/7/365 enterprise class support, professional services, and managed services.

We know there are existing applications that will not work well (or at all) in a commodity infrastructure public cloud.  We also recognize it will take significant time and resources to re-platform these applications.  So, we work with agencies to overcome these challenges and enable a cloud business model now, for all applications, not just the small subset that can survive on commodity cloud.

Technology is delivered via a pre-integrated, shared infrastructure pool, providing access to needed resources without over-buying – you pay for what you use.

At the core, agencies need a business model approach that reduces the risk of the cloud transition and ensures they achieve their desired efficiency goals – so they can innovate. That’s the real silver lining. Agencies don’t have to wait until an application is re-architected; they can limit risk and position themselves for success in the cloud today.

Learn more – read our new white paper, “The Business of Cloud.”

 

This blog post was originally published here.

Make Cybersecurity a Strong Foundation–Not a Flimsy Add-on

After serving 45 years in Federal service, it still amazes me that cybersecurity is treated as an add-on, focused on preventing access to the network, rather than as an integrated part of the foundation, working at every level of our systems to protect the data.

We have said for decades that security can’t just be a bolt-on feature and that it needs to be considered as part of the architecture and foundation. Part of the problem is that security implementation is still way too hard. It is still treated by vendors as an option that has to be turned on and integrated, and the default is to install software without security turned on. This is just plain wrong.

This may have made sense years ago, when implementing full security for classified systems could consume 20 percent to 30 percent of your resources, which might not work for high-performance systems. Nowadays, the security impact tends to be 2 percent or 3 percent, and no CIO is going to risk their career by turning it off without a lot of thought.

We have also discovered that security is needed in virtually all systems to protect privacy, financial, and commercial data, not just classified government data. Systems should be installed with security options turned on by default. CIOs and system administrators should have to choose to turn off security.

Vendors also need to simplify security implementation, so that it is much more plug-and-play, based on major security categories. Start with options for highly classified Security Technical Implementation Guides (STIGs); then government-restricted, financial, and health care/privacy, and other; and last, offer an option for NO security (not recommended). That is around five options, not thousands of variables.
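The "around five options" idea can be sketched as a product that ships with a handful of named security profiles and is secure by default. Everything here is illustrative: the profile names, settings, and `install` helper are hypothetical, not any vendor's actual API.

```python
# Illustrative sketch: a handful of named security profiles instead of
# thousands of tunable variables, with security on by default.
# Profile names and settings are hypothetical.

PROFILES = {
    "stig":       {"encryption": True,  "auditing": True,  "mfa": True},
    "gov":        {"encryption": True,  "auditing": True,  "mfa": True},
    "financial":  {"encryption": True,  "auditing": True,  "mfa": False},
    "healthcare": {"encryption": True,  "auditing": False, "mfa": False},
    "none":       {"encryption": False, "auditing": False, "mfa": False},
}

DEFAULT_PROFILE = "gov"  # security is on unless someone chooses to turn it off

def install(profile=None):
    """Return the security settings applied at install time."""
    name = profile or DEFAULT_PROFILE
    if name == "none":
        print("WARNING: installing with security disabled (not recommended)")
    return PROFILES[name]

assert install()["encryption"]  # no choice made -> secure defaults apply
```

The point of the sketch is the inversion: an administrator must actively select "none" (and get warned) rather than actively turn security on.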

Installations should start by looking for the Identity and Access Management (IDAM) software and then inheriting security settings from it. Then move on to Role Based Access Management/Attribute Based Access Controls (RBAC/ABAC). All systems should have Privileged User Management Access (PUMA) controls in place, and database administrators should be able to see data only by exception. We have to stop relying on edge protection at the network level and build security in at every level.
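The "data only by exception" rule for database administrators can be sketched as a role check in which the DBA role manages the system but cannot read row contents without an explicit, logged exception grant. Role names and the audit mechanism here are illustrative, not part of any standard:

```python
# Minimal sketch of "database administrators see data only by exception":
# the DBA role manages the database but cannot read protected row contents
# without an explicit, logged exception grant. Names are illustrative.

AUDIT_LOG = []

def can_read_data(role, exception_grant=False):
    """Return True if this role may read protected row contents."""
    if role == "dba":
        if exception_grant:
            AUDIT_LOG.append("dba read allowed by exception")  # always logged
            return True
        return False  # DBAs manage the system, not the data in it
    return role in ("analyst", "clinician")  # ordinary authorized readers

assert not can_read_data("dba")                    # default: no data access
assert can_read_data("dba", exception_grant=True)  # explicit, audited exception
```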

Data should be protected at rest as well as in flight, by encryption and dynamic Virtual Private Networks (VPNs) at every level. Why do we continue to treat the network like an open party line that any device can listen in on? If the packet is not for you, then only the packet header should be visible, not all the data.

We also need continuous security monitoring that validates that the security is on and that it continuously meets security standards, not just when it is installed.
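Continuous monitoring of this sort amounts to comparing live settings against the approved baseline on every pass, not just at install time. A minimal sketch, with invented setting names:

```python
# Hypothetical compliance drift check: report any setting that no longer
# matches the approved security baseline. Run on a schedule (cron, agent,
# etc.) and alert on any non-empty result.
BASELINE = {"encryption_at_rest": True, "tls_in_flight": True, "audit_log": True}

def find_drift(live_settings: dict) -> list:
    """Return the names of settings that have drifted from the baseline."""
    return [name for name, required in BASELINE.items()
            if live_settings.get(name) != required]

drift = find_drift({"encryption_at_rest": True,
                    "tls_in_flight": False,
                    "audit_log": True})
print(drift)  # ['tls_in_flight']
```

A check like this catches the common failure mode the column describes: security that was turned on at installation and quietly turned off later.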

Security should also be baked in at the chip level, as Intel is starting to do. To make our systems secure and protect our digital futures, we need to bake security in at all levels, simplify installation and maintenance, and protect our data by default. Why do we build security so complex that almost no one can get it right? Remember the VCRs that constantly blinked the wrong time because few average people could figure them out? Their successors set the correct time automatically, by default. What we need is security for dummies, so that we can all get it right.

Let’s fix browsers so they are secure and can’t let in malware. Let’s stop the insider threat by limiting access and constantly monitoring usage. Let’s secure the Internet of Things (IoT) by building security into everything, including sensors, peripherals, devices, appliances, and vehicles.

My plea to vendors is “enough already”: Stop making security hard to implement and build it into the foundation of your systems. To government and commercial CIOs, start holding vendors accountable. If they do not build security into the DNA of their products, find a different vendor.

Always consider the cost of installing, maintaining, and operating your security as part of the total cost of ownership. How long are you going to keep your job if you have a major data breach? Don’t skimp on security, demand it as a starting point. It is time we all got serious about cybersecurity.

About the Author

Kenneth M. Ritchhart is vice president of Business Development & Strategic Planning at Oracle Public Sector.

The Weekend Reader – April 1

MIT Team Wins $75M in Federal Funds for Research in High-Tech Fabrics

bostonglobe.com
A consortium of colleges and businesses led by the Massachusetts Institute of Technology has won a national competition to host a novel federally funded research program to turn clothing fibers and fabrics into wearable electronic devices, officials are expected to announce Friday. Clothing fibers could be designed to change color, monitor health, or even store energy.


Google ‘Mic Drop’ April Fools’ Gag Goes Horribly Wrong

fortune.com
Google ended up with egg on its face after this year’s April Fools’ joke caused some Gmail users to insult contacts and, some claimed, lose employment opportunities. The “joke” was an actual feature that Google added to Gmail, called “Mic Drop.” An orange button next to the standard blue “send” button allowed people to send their email with an animated image of a Minions character dropping a microphone. Outraged Gmail users who use the service for professional purposes flooded Google’s product forums to complain about having accidentally clicked the button on important work emails.

Here’s Snapchat’s Latest Report on Government Requests For Data

fortune.com
As it continues to grow, ephemeral messaging app Snapchat is receiving an increasing number of government requests for user data. On Tuesday, the Venice, Calif., company published its latest report on these requests, which it does every six months. Between July 1 and Dec. 31, 2015, Snapchat received a total of 862 criminal legal requests from U.S. government entities, up from 761 in the six months before that.


Tech Giants, Government Struggle with Online Speech Policies

cio.com
As social media outlets increasingly become the favorite channels for terrorist groups to spread messages of violence and recruit new members, the Internet companies that maintain those services are in a tough spot. Companies born on the Web like Google and Facebook promote an ethos of free speech, but at the same time recognize the dangers of terrorists, criminals, and other bad actors co-opting their platforms in service of a violent ideology or illegal activities.


Can the IRS Protect Taxpayer Data? Government Accountability Office Raises Concerns

bostonglobe.com
Just in time for tax season, the Government Accountability Office is warning that weak financial controls at the Internal Revenue Service leave taxpayer information at risk.
In a report released this week to IRS Commissioner John Koskinen, the GAO noted the agency’s progress in information security but said “weaknesses in the controls limited their effectiveness in protecting the confidentiality, integrity, and availability of financial and sensitive taxpayer data.”


The Situation Report: CIO Power Struggles & Data Center Disasters

FITARA Fault Lines

My Capitol Hill listening post has picked up several encrypted messages from the Government Accountability Office suggesting that the Federal Information Technology Acquisition Reform Act may be widening some major fault lines across government. While the law was designed to strengthen the role of the chief information officer, the Situation Report has picked up signals that some agency CIOs continue to be bulldozed by their CxO counterparts.

“CFOs eat CIOs for lunch,” according to one open-source report.

“CIOs are being pressured to sign off on FITARA plans they don’t agree with,” said Dave Powner, Director of IT Issues at the GAO. “We need stronger CIOs across the board.”

Meanwhile, the fainthearted have retreated to the relative safety of complaining that FITARA is yet another compliance exercise designed to be an innovation speed bump. Indicators are strong that the old compliance checklist mantra isn’t going to work in this case. What is working, say insiders, is the slow but steady culling of feckless leaders.

There may be another, less treacherous, route to FITARA success: giving agencies the resources they need to remain in compliance with the law. Strong signals coming from the Agriculture Department indicate that implementing FITARA correctly required seven full-time employees and more than $3 million during the first year. Since new funding wasn’t in the cards, USDA reverted to volunteers.

Get Ready For Agency Self-Assessments

Not. Remember those FITARA Self-Assessments that agencies were supposed to send to the Office of Management and Budget by April? Well, most of them are in, but my OMB listening post has picked up solid evidence that the White House has no plans to make those assessments public.

Data Center Disasters

The Federal government continues to make progress in its effort to close data centers. The Federal Data Center Consolidation Initiative (FDCCI) has spearheaded the closure of 3,125 of the 10,584 data centers that we know about. But leave it to OMB to redefine what a data center actually is, and this is what you get: the very real possibility that the 10,584 data centers we currently know about could increase.

“We are nowhere close to optimizing our data centers,” according to Powner, who plans to release a status report in late May.

Congress also has plans to take a hard look at the government’s legacy system closure rate. My Capitol Hill remote outpost reports that most agencies have no plans for the millions of lines of old code still in existence. “Far less than half of those old systems have a plan for replacement,” according to one Hill watcher.

Cyber ISR Boost

Thanks to the Federation of American Scientists, the Situation Report got its hands on a heavily redacted copy of the Fiscal 2016 Congressional Budget Justification Book for the Military Intelligence Program. The 178-page document contains a few nuggets of back channel intelligence worth noting.

The Air Force National Guard is realigning up to 258 personnel to stand up a Cyber Intelligence, Surveillance, and Reconnaissance Group in Massachusetts, and another 89 members to stand up a similar squadron in California. Total cost: $19 million.

Endpoint Enemies Beware – Ensuring Endpoint Security

The volume and variety of endpoints is growing, as more and more devices connect to Federal networks. Feds are worried security can’t keep up.

A recent MeriTalk report estimates that 44 percent of endpoints that access Federal agency networks are at risk, and that nearly one-third of agencies have experienced breaches via endpoints.
