2.3.16

Managing agile virtual machine security across the enterprise: A closer look


Moving security to different digital intersections may serve to reduce the load on the endpoint – thereby avoiding duplicate scans, say, during a malware storm.
However, it is just as important to understand how and when to deploy your network defenses in real time, and how attacks might dictate that approach. Here we look at best practices, and at striking a balance between network load, endpoint load and attack-defense agility.
No two attacks are alike. If you have a server room full of payment-processing physical and virtual servers, you deploy (or should deploy) a very different mix of security tools than someone running thin clients for a call center, and (hopefully) different defense methodologies.
Increasingly, VM environments house a broad mix of machines: a few hosts full of accounting database servers, a few hosts of Windows desktop VMs, and a smattering of other VMs to round out the enterprise. This is where the need for agility applies.
For example, with VMware’s vShield App and Endpoint, you can route potentially suspicious traffic across virtualized networks to VM host servers with plenty of power for enterprise scanning, and then add and remove endpoints from that pool dynamically, as traffic dictates.
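The elastic scan-pool idea can be sketched in a few lines of Python. This is a hypothetical model only – `ScanPool`, its thresholds and the notion of "flows per host" are illustrative, not part of any real vShield (or other vendor) API:

```python
# Hypothetical sketch: sizing a pool of VM scanning hosts to match
# the current volume of suspicious traffic. All names and numbers
# are illustrative, not a real vShield (or any vendor) API.

class ScanPool:
    def __init__(self, min_hosts=1, max_hosts=8, per_host_capacity=100):
        self.min_hosts = min_hosts                    # never scale below this
        self.max_hosts = max_hosts                    # hard ceiling for the pool
        self.per_host_capacity = per_host_capacity    # flows one host can scan
        self.hosts = min_hosts

    def resize(self, suspicious_flows):
        """Grow or shrink the pool so scanning capacity tracks demand."""
        needed = -(-suspicious_flows // self.per_host_capacity)  # ceiling division
        self.hosts = max(self.min_hosts, min(self.max_hosts, needed))
        return self.hosts

pool = ScanPool()
print(pool.resize(250))  # malware storm: 3 hosts
print(pool.resize(40))   # quiet period: back to 1 host
```

In a real deployment the resize decision would be driven by telemetry from the hypervisor, and "adding a host" would mean attaching another scanning VM to the virtual network rather than incrementing a counter.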
This kind of rethinking requires us to reconsider enterprise security in a different context. Not only is moving VMs around the enterprise de rigueur, so is moving entire networks on the fly. This means, in turn, that tracking the changes and keeping an appropriate security defense layering schema current becomes much more nuanced, but also much more important. Add the rollout of software-defined networking (SDN), and your enterprise becomes very agile indeed.
But with both network and host agility across a dynamic environment, mistakes are easy to make. Knowing the state of all your endpoints and networks – especially across datacenters – means dashboarding and snapshots (and versioning to be able to replay configuration steps) become paramount. Do you know what the parameters of your perimeters, networks, and clusters of endpoints’ security look like right now? If not, you’re not alone.
Not to worry, though. Last year at VMworld there were many presentations about the real pain and suffering that can accompany a migration to this type of architecture (along with accompanying workarounds). And while you may understand the security of old, static systems, it is not obvious that you will be able to manage a more dynamic environment until you learn the tools and fully understand the mission behind each group of VMs scattered around the enterprise.
So it’s best to roll out a small mockup of the eventual architecture you want to migrate to, create some pseudo-real workloads (of non-critical tasks), spin it all up and watch what happens. In this way you can establish a Phase A mockup to stage, then roll to a Phase B, which is exposed to more traffic and more potentially hostile traffic. During this exercise you can start instrumenting and tuning your sensors for the right amount of threat intelligence for a given environment. Then, when you are ready to move into production, you’ll have a fairly good idea of what the pinch points and strengths are surrounding your system. You’ll also know what loads different systems can handle, and where best to locate your security sensors.
Ten years or so ago, when virtual machines were in their nascent forms, no one thought there would be a strong need for this level of management. But in today’s environment, where the technology has proven itself under heavy, continuous and continually changing workloads, you should think twice about ramping up a full production environment without really understanding your security stance, and you only get there by testing, not by fire-and-forget.


The key trends in digital transformation: hybrid integration and APIs



Digital transformation is happening everywhere. It was the trend of 2015 and will be a top theme for most organizations in 2016 as well. According to Wilfred Harbers, Director Digital Solutions Consulting Benelux at Software AG, this is partly because consumer behavior keeps driving change and puts a question mark over every business model. What developments are we seeing in software integration and APIs, and how can organizations use them to effectively support their digital transformation in 2016? Software AG lists eight developments:

1.    Everything goes hybrid
The complexity of cloud adoption and the need for more innovation to build online apps are forcing IT to explore different cloud options. Companies want a hybrid cloud, hybrid integration and even a managed cloud. Alongside the familiar public cloud solutions, companies now want to move to genuinely hybrid solutions. Instead of focusing solely on public and private cloud options, IT will explore other models for more flexibility and control.

2.    The role of Swagger in API development grows
The Swagger API framework is becoming the de facto standard, and initiatives such as the Open API Initiative are standardizing Swagger’s role in API development. Vendors will rally behind the Open API initiatives and give Swagger a much-needed boost. As a result, it will grow into the most widely used API standard, while RAML (RESTful API Modeling Language) will fade from view over time.
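For readers unfamiliar with the format, a Swagger document is simply a structured description of an API. Below is a minimal, illustrative Swagger 2.0 document built as a plain Python dict; the "Orders API" is made up for this sketch, while the field names follow the public Swagger 2.0 specification:

```python
# A minimal Swagger 2.0 description of a single endpoint.
# The API itself ("Orders API") is invented for illustration;
# the field names come from the Swagger 2.0 specification.

spec = {
    "swagger": "2.0",
    "info": {"title": "Orders API", "version": "1.0.0"},
    "paths": {
        "/orders/{id}": {
            "get": {
                "parameters": [{
                    "name": "id",
                    "in": "path",
                    "required": True,
                    "type": "string",
                }],
                "responses": {
                    "200": {"description": "A single order"},
                },
            },
        },
    },
}

# Code generators, API portals and mock servers read exactly these
# fields to derive clients, documentation and test stubs.
print(sorted(spec["paths"]))
```

It is this machine-readable shape – not any particular tool – that lets an ecosystem of generators and portals grow around a single standard.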

3.    APIs enable self-service integration
Imagine a world in which everything is an API and all your data is directly accessible to you and to those you grant permission. That includes your customers, partners, suppliers, banks and nearly everyone else in your ecosystem. How do you manage this complicated world of data access? It takes a robust API management platform that goes beyond self-service API portals and also supports security, access control, administration, service architecture, governance, monitoring and integration.

4.    Bimodal IT goes mainstream
Developers, business integrators and DevOps teams will work together even more closely to deliver new applications and interfaces and to accelerate the organization’s overall innovation quotient. Different integration models will converge within IT and strengthen the organization.

5.    Integration reveals the hidden value of Big Data
Until now, companies have used Hadoop platforms to store datasets without exploiting their full value. Growing integration of existing systems with new Hadoop platforms will bring that hidden value to light. Big Data can finally be used to make smart decisions, for example to optimize the customer experience.

6.    Microservices replace monolithic architectures
Microservices are slowly taking the place of monolithic architectures. As organizations get to work on digital transformation, they realize that monolithic legacy architectures are the biggest obstacle to rapid innovation, and they are finding ways to implement more DevOps-friendly architectures based on microservices.

7.    MDM and IoT event handling converge
The ever-broader definition of Customer 360 means that Master Data Management platforms put data about customers’ buying behavior to use by linking it to standard MDM data-quality processes such as cleansing and matching. For example, synchronizing and enriching the customer master record with correlated sensor data will become a requirement of the MDM customer-data consolidation process.

8.    MDM gets prettier and smarter
MDM solutions will keep supporting employees in the data-steward role, making it easier to keep data clean and fit by bringing together integrated data, business intelligence tools and dashboarding. For organizations, this raises the value and impact of superior data quality in their business processes.

“This year the focus is on defining and building the IT capabilities needed for faster transformation. Organizations realize that existing models are only partly able to support this transformation. In short, our predictions are about how organizations can better compose, build and roll out business applications. That, after all, will determine how they can transform themselves into digital enterprises,” says Harbers.


New Ponemon Study: Cyber Onslaught Threatens to Overwhelm Healthcare Organizations


ESET®, a global pioneer in proactive protection for more than two decades, and the Ponemon Institute, a privacy and information management research firm, today announced results of The State of Cybersecurity in Healthcare Organizations in 2016 (February 2016). According to the study, healthcare organizations average about one cyber attack per month. Almost half (48 percent) of respondents said their organizations have experienced an incident involving the loss or exposure of patient information during the last 12 months. Yet despite these incidents, only half indicated their organization has an incident response plan in place.
"The concurrence of technology advances and delays in technology updates creates a perfect storm for healthcare IT security," said Stephen Cobb, senior security researcher at ESET. "The healthcare sector needs to organize incident response processes at the same level as cyber criminals to properly protect health data relative to current and future threat levels. A good start would be for all organizations to put incident response processes in place, including comprehensive backup and disaster recovery mechanisms. Beyond that, there is clearly a need for effective DDoS and malware protection, strong authentication, encryption and patch management."
Key findings of the survey:
·         Exploiting existing software vulnerabilities and web-borne malware attacks are the most common security incidents. According to 78 percent of respondents, the most common security incident is the exploitation of existing software vulnerabilities more than three months old.
·         On average, organizations have an advanced persistent threat (APT) incident every three months. Respondents experienced an APT attack about every three months during the last year. Sixty-three percent said the primary consequences of APTs and zero-day attacks were IT downtime followed by the inability to provide services (46 percent of respondents), which create serious risks for patient treatment.
·         Hackers are most interested in stealing patient information. The most attractive and lucrative target for unauthorized access and abuse can be found in patients' medical records, according to 81 percent of respondents.
·         Healthcare organizations worry most about system failures. Seventy-nine percent of respondents said that system failures are one of the top three threats facing their organizations. This is followed by cyber attackers (77 percent) and unsecure medical devices (77 percent).
·         Technology poses a greater risk to patient information than employee negligence. The majority (52 percent) of respondents said legacy systems and new technologies to support cloud and mobile implementations, big data and the Internet of Things increase security vulnerabilities for patient information. Respondents also expressed concern about the impact of employee negligence (46 percent) and the ineffectiveness of HIPAA-mandated business associate agreements designed to ensure patient information security (45 percent).
·         DDoS attacks have cost organizations on average $1.32 million in the past 12 months. Thirty-seven percent of respondents say their organization experienced a DDoS attack that caused a disruption to operations and/or system downtime about every four months. These attacks cost an average of $1.32 million each, including lost productivity, reputation loss and brand damage.
·         Healthcare organizations need a healthy dose of investment in technologies. On average, healthcare organizations represented in this research spend $23 million annually on IT; 12 percent on average is allocated to information security. Since an average of $1.3 million is spent annually for DDoS attacks alone, a business case can be made to increase technology investments to reduce the frequency of successful attacks.
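The business case in that last point is simple arithmetic; a quick back-of-the-envelope check of the study’s own figures:

```python
# Back-of-the-envelope check of the study's figures: $23M average
# annual IT spend, 12% of it allocated to security, against roughly
# $1.3M per year lost to DDoS attacks alone.

it_budget = 23_000_000
security_share = 0.12
annual_ddos_cost = 1_300_000

security_budget = it_budget * security_share
ddos_vs_security = annual_ddos_cost / security_budget

print(f"Security budget: ${security_budget:,.0f}")                 # $2,760,000
print(f"DDoS losses vs. security budget: {ddos_vs_security:.0%}")  # 47%
```

In other words, annual DDoS losses alone approach half of the typical security budget – which is exactly the study’s argument for increasing the investment.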
"Based on our field research, healthcare organizations are struggling to deal with a variety of threats, but they are pessimistic about their ability to mitigate risks, vulnerabilities and attacks," said Larry Ponemon, chairman and founder of The Ponemon Institute. "As evidenced by the headline-grabbing data breaches over the past few years at large insurers and healthcare systems, hackers are finding the most lucrative information in patient medical records. As a result, there is more pressure than ever for healthcare organizations to refine their cybersecurity strategies."
You can access the survey report here: http://business.eset.com/cybersecurity-healthcare-survey/.
You can read more insights from Stephen Cobb and learn more of the survey's findings in this post:
New Ponemon Study: With Cybercrime Still on the Rise, It's Time to Take Action.
Methodology 
The State of Cybersecurity in Healthcare Organizations in 2016 surveyed 535 IT and IT security practitioners in small- to medium-sized healthcare organizations in the U.S. Sixty-four percent of respondents are employed by HIPAA covered entities, 36 percent by business associates of covered entities. Eighty-eight percent of organizations represented in this study have 100-500 employees.
About Ponemon Institute
Ponemon Institute conducts independent research and education that advances information security, data protection, privacy and responsible information management practices within businesses and governments throughout the world. Our mission is to conduct high quality, empirical studies on critical issues that affect the protection of information assets and IT infrastructure. As a member of the Council of American Survey Research Organizations (CASRO), we uphold strict data confidentiality, privacy and ethical research standards. www.ponemon.org.


29.2.16

The security review: porn clicker trojans at Google Play



Welcome to this week’s security review, which includes detailed commentary and analysis on porn clicker trojans at Google Play, digital childhoods and the industrialization of cybercrime.
Porn clicker trojans at Google Play
At the time of writing, ESET has found 343 malicious porn clicker trojans on Google Play, which it describes as one of the largest malware campaigns on the app store. An analysis by malware researcher Lukáš Štefanko revealed that many Android devices have been infected. He later noted in an interview with We Live Security that, despite commendable efforts by Google’s security team, the malware’s authors are proving hard to combat. His colleague, security evangelist Peter Stancik, added that “the creators of these trojans ride the wave of interest in popular applications, notably in games”.
Digital childhoods and internet-savvy countries
Security evangelist Ondrej Kubovič reported on what parents in the UK, US, Germany and Russia think is the most appropriate age to introduce ‘digital activities to their children’. Surveys carried out by ESET found that Russian parents tend to be stricter with their children under the age of five, while moms and dads in the other sampled countries were found to be far more relaxed. His analysis of the results also revealed consensus – all four countries agreed that children have access to technology and the internet far too early.
Linux Mint site hacked, users unwittingly download backdoored operating system
Independent security analyst Graham Cluley drew attention to a compromised version of the Linux Mint operating system, which is “playing host to a Linux ELF trojan called Tsunami”. This, he elaborated, has the ability to launch distributed denial-of-service (DDoS) attacks, as well as steal files from your computer. He said: “If I were a user who might have had their personal information exposed, or their computer compromised, I wouldn’t be wasting any time taking action to ensure that any damage was limited.”
The industrialization of cybercrime may be upon us
The “industrialization” of cybercrime is now a very real thing, claimed Dr. Adrian Nish, head of cyber threat intelligence at BAE Systems. He explained that the criminal activity is becoming increasingly “professionalized”. The Telegraph, which quoted him, also reported that the multinational defence, security and aerospace company fights off cyberattacks on a weekly basis, highlighting how serious the problem has become. One of the ways around this is to understand what motivates cybercriminals, said his colleague Kevin Taylor, head of applied intelligence.
Privacy and security ‘war’ must come to an end
Government officials and cybersecurity experts have joined forces to help settle the ongoing and deeply divisive privacy/security debate. The Digital Equilibrium Project has been set up to help “foster a new, productive dialogue on balancing security and privacy in the connected world”. Art Coviello, former executive chairman of RSA and organizer of the Digital Equilibrium Project, said that the “standoff between Apple and the US government is a symptom of a larger issue”, explaining that laws, policies and “social constructs” need to catch up with the pace of technological change.



There will be no online voting on Super Tuesday. Here’s why


Super Tuesday is still offline. Where’s the catch?
The online era doesn’t extend to most elections. That’s the catch
Is the virtual world safe for elections? Not as safe as it is for payments

America’s presidential election is fast approaching and the nation, along with the rest of the world, is waiting to see who will be chosen to run for the White House. Donald Trump for the Republicans? Hillary Clinton or Bernie Sanders for the Democrats? March 1st, also known as Super Tuesday, may hold the answers: it’s the day when the largest number of US states hold their primaries.
Yet even though the United States is among the most technologically advanced nations in the world, most voters there cannot cast their ballots online. This is despite the fact that nowadays we can do pretty much anything in the virtual world: working, entertainment, paying bills and shopping are all part of our everyday online lives.

So is Internet voting really such a risk? And if so, where’s the catch?
There are actually several catches. First of all, cyberspace isn’t as safe as everyone thinks – not even for banking or paying for online shopping – if you’re not properly protected.

E-commerce and online voting don’t compare
The upside is that potential fraud affects only a small portion of all online transactions. Due to this, online merchants, banks and big companies can ‘hide’ the costs that the victims of fraud would normally have to pay. The rather unpopular downside is that everyone ends up covering these losses in the form of fees or higher prices.
But this approach doesn’t apply to online voting. Who would pay for the damage done by electoral fraud? And what would be the mechanism to fix glitches, especially if they were uncovered years later? Making things even worse, voting is anonymous, so by design there should be no way to find out who rigged the results or who cast the fraudulent ballots.
Unlike an ‘old-school’ election, there is no paper trail in cyberspace and trying to achieve something similar might prove difficult. The metadata could easily be corrupted or manipulated, without leaving a trace. And let’s not forget that avoiding detection is a specialty of most types of malware.

Cheap cybercrime vs. big money in elections
It’s also worth mentioning that other cyber threats can mess with the electoral process, such as an army of zombie computers – a.k.a. botnets – that could overload an official election webpage or, even worse, cast thousands of ballots in favor of a preselected candidate. If the cyber criminals were skilled enough, they could actually do everything via victims’ computers.

In this equation, the price of malware is a considerable factor too. Its costs are low compared to the potential gains from manipulating an election. It might take as little as tens of thousands of dollars to rig an outcome, which is negligible compared to the vast sums invested in campaigns. Then there is the fact that some parties want to win very badly and other big players, such as corporations or other nation states, might also feel tempted to influence the final result.