
Reg Whitepapers

Biting the hand that feeds IT

Copyright: Situation Publishing

Hyperconverged Infrastructure as a Catalyst for Change

Many IT teams are dogged by infrastructure complexity and heavy reliance on manual processes and home-grown scripts.

Newer technology options such as hyperconverged infrastructure can help to simplify and inject more automation into the IT systems environment.

But this has knock-on effects in terms of skills requirements and the nature of future operations work, and both IT leaders and practitioners at the sharp-end need to be prepared.

The Effectiveness of Tools in Detecting the 'Maleficent Seven' Privileges in the Windows Environment

Windows privileges add a layer of complexity on top of Windows user permissions. Each additional user added to a privileged group can lead to a domain compromise if not evaluated, and privileges can override permissions, creating a gap between perceived and effective permissions. System administrators currently rely on tools such as Security Explorer, Permissions Analyzer for Active Directory, or Gold Finger to help with this problem. An analysis of these three tools is needed to give administrators a window into these complex effective permissions. This research uncovered a gap in the ability of the current tools to identify users holding privileges – a gap the author filled using PowerShell.
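The core idea – cross-referencing privilege assignments against a list of dangerous privileges – can be sketched in a few lines. The paper's own work uses PowerShell and defines its specific "Maleficent Seven" list; the Python below is a hypothetical illustration, and the privilege names and sample data are assumptions, not the paper's actual list.

```python
# Hypothetical sketch: flag accounts holding high-risk Windows privileges.
# The names below are well-known dangerous Windows privileges; the paper
# defines its own "Maleficent Seven" set, which may differ.

HIGH_RISK = {
    "SeDebugPrivilege",
    "SeBackupPrivilege",
    "SeRestorePrivilege",
    "SeTakeOwnershipPrivilege",
    "SeImpersonatePrivilege",
    "SeLoadDriverPrivilege",
    "SeTcbPrivilege",
}

def flag_privileged_users(assignments):
    """assignments: mapping of account name -> set of privilege names
    (e.g. exported from a local security policy dump). Returns accounts
    holding at least one high-risk privilege, with the offending names."""
    return {
        user: sorted(privs & HIGH_RISK)
        for user, privs in assignments.items()
        if privs & HIGH_RISK
    }

# Illustrative sample data (hypothetical accounts):
sample = {
    "CORP\\svc-backup": {"SeBackupPrivilege", "SeRestorePrivilege"},
    "CORP\\alice": {"SeChangeNotifyPrivilege"},
}
print(flag_privileged_users(sample))
```

Here only the service account is flagged; an ordinary user holding no high-risk privilege is ignored, which mirrors the gap analysis the abstract describes.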

Active Defense via a Labyrinth of Deception

A network baseline allows for the identification of malicious activity in real time. However, a baseline requires that every listed action is known and accounted for – a nearly impossible task in any production environment, due to an ever-changing application footprint, system and application updates, shifting project requirements, and, not least of all, unpredictable user behavior. Each obstacle presents a significant challenge to developing and maintaining an accurate, false-positive-free network baseline. To surmount these hurdles, network architects need to design a network free from continuous change, including changing company requirements, untested system or application updates, and the presence of unpredictable users.
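The baselining approach described above reduces to learning a set of known-good flows and alerting on anything outside it. A minimal sketch follows; the flow tuples, field choices, and addresses are illustrative assumptions, not anything prescribed by the paper.

```python
# Minimal sketch of network-flow baselining: any live flow not present in
# the learned baseline is flagged. Flow representation is illustrative.

def build_baseline(observed_flows):
    """Learn the set of (src, dst, dport) tuples seen during a trusted window."""
    return set(observed_flows)

def detect_anomalies(baseline, live_flows):
    """Return live flows absent from the baseline. In production, churn from
    updates and user behavior makes many of these false positives - the
    obstacle the abstract describes."""
    return [flow for flow in live_flows if flow not in baseline]

baseline = build_baseline([
    ("10.0.0.5", "10.0.0.20", 443),
    ("10.0.0.5", "10.0.0.21", 53),
])
alerts = detect_anomalies(baseline, [
    ("10.0.0.5", "10.0.0.20", 443),          # known flow -> ignored
    ("10.0.0.5", "198.51.100.7", 4444),      # new destination -> flagged
])
print(alerts)
```

Every legitimate change to the environment enlarges the set that must be re-learned, which is why the paper argues for moving the problem into a controlled, deception-based network instead.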

Understanding the Use of Honey Technologies Today

The aim of this study is to fill in the gaps in data on the real-world use of honey technologies. The goal has also been to better understand information security professionals’ views and attitudes towards them. While there is a wealth of academic research in cutting-edge honey technologies, there is a dearth of data related to the practical use of these technologies outside of research laboratories. The data for this research was collected via a survey which was distributed to information security professionals.

This research paper includes details on the design of the survey, its distribution, analysis of the results, insights, lessons learned and two appendices: the survey in its entirety and a summary of the data collected.

Triaging the Enterprise for Application Security Assessments

Conducting a full array of security tests on all applications in an enterprise may be infeasible in both time and cost. According to the Center for Internet Security, the purpose of application-specific and penetration testing is to discover previously unknown vulnerabilities and security gaps within the enterprise. These activities are only warranted after an organization attains significant security maturity, which results in a large backlog of systems that need testing.

When organizations finally undertake the efforts of penetration testing and application security, it can be difficult to choose where to begin. Computing environments are often filled with hundreds or thousands of different systems to test and each test can be long and costly. At this point in the testing process, little information is available about an application beyond the computers involved, the owners, data classification, and the extent to which the system is exposed. With so few variables, many systems are likely to have equal priority.
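Even with so few variables, they can be combined into a rough priority score to break ties in the backlog. The sketch below is a hypothetical illustration of that idea; the field names, categories, and weights are assumptions, not a method from the paper.

```python
# Hypothetical triage scoring: rank applications for assessment using the
# few attributes typically known up front (exposure, data classification).
# Weights and category names are illustrative assumptions.

DATA_WEIGHT = {"public": 0, "internal": 1, "confidential": 2, "regulated": 3}
EXPOSURE_WEIGHT = {"internal-only": 0, "partner": 1, "internet-facing": 2}

def triage_score(app):
    # Exposure weighted more heavily: internet-facing systems are reachable
    # by any attacker, regardless of the data they hold.
    return 2 * EXPOSURE_WEIGHT[app["exposure"]] + DATA_WEIGHT[app["data_class"]]

apps = [
    {"name": "hr-portal", "exposure": "internal-only", "data_class": "regulated"},
    {"name": "storefront", "exposure": "internet-facing", "data_class": "confidential"},
    {"name": "wiki", "exposure": "internal-only", "data_class": "internal"},
]
ranked = sorted(apps, key=triage_score, reverse=True)
print([a["name"] for a in ranked])
```

A scheme like this does not replace judgment, but it turns an undifferentiated backlog into an ordered queue that testing budgets can be applied against.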

Certification and Accreditation Vs System Development Life Cycle Management

In 1987, the US Congress passed the Computer Security Act of 1987, Public Law (PL) 100-235. This law launched one of the first technological control mandates – one that has expanded greatly over the years – identifying the need to control technology in order to protect information that is processed, stored, transmitted, and received in electronic form.

These protective measures have been encapsulated into various procedures, such as the Certification and Accreditation of information systems based on various Life Cycle Management descriptions. Both processes produce essentially the same results, with steps that are largely identical under different titles.

Disaster recovery on demand with the ClearSky global storage network

ClearSky provides customers with access to all of their data anywhere they want it, anytime they need it through an end-to-end, fully managed service. Rather than manage complex, expensive pools of storage that must be regularly upgraded and expanded, customers simply plug into the ClearSky network to get the performance, low latency and availability they need to access all their data on demand.

Built-in data protection with ClearSky

For decades, enterprise storage has been about piece parts – primary and secondary storage, backup, archival and disaster recovery storage – and trying to make these disparate systems work together. It’s an environment that’s hard to manage, over-provisioned (with capacity that has to be purchased in advance) and error-prone (with problems that cascade across multiple systems). Most of all, it’s expensive – a massive waste of physical and capital resources, and a huge, ongoing time commitment for IT.

How to define hybrid cloud

Most enterprise IT teams are under pressure from leadership to make hybrid happen – in one way or another. Driven by the public cloud’s promise of low costs and unlimited flexibility, 71 percent of organizations are now using hybrid cloud (according to RightScale). That number is up from 58 percent the previous year – which shows that interest and adoption certainly aren’t problems.

On-demand primary storage, offsite backup and DR

Managing today’s enterprise storage is like being stuck on a treadmill that keeps going faster.

• Data footprints are huge and keep growing.
• Businesses crave agility, but today’s data protection requires costly licenses, multiple copies, and secondary infrastructure.
• The cloud is an incredible resource, but latency and unpredictable performance get in the way.
• Traditional storage solutions are costly and complex.

ClearSky Data’s dedicated network simplifies the entire data lifecycle and delivers enterprise storage as a fully managed service.

Custom Chips for Dummies

Learn everything you have ever wanted to know about custom SoC/ASIC solutions and the benefits they offer.

Here’s your ticket to significantly reducing your BOM, creating ultra-low-power devices and increasing product functionality. An SoC/ASIC solution is faster, easier, and much lower-risk than you might think. This short, easy-to-read guide uses real-world examples, top tips, and watch-outs to explain, step by step, how to get a custom chip design to silicon.


Arm will process your information in accordance with our Privacy Policy. Please visit Arm’s subscription center to manage your marketing preferences or unsubscribe from future communications.

IoT Solutions for Dummies

The Internet of Things (IoT) is rapidly disrupting markets and transforming business strategies. This book offers key insights into what organizations need to consider as they begin to undertake IoT deployments.

IoT Solutions for Dummies provides insights on:

• What is required to design a robust, secure, and scalable IoT solution
• Important challenges to consider and why it is vital to address them early
• Ten principles to follow for a successful deployment



Protecting your Office 365 data

Companies across the globe rely on the critical capabilities, flexibility and scale provided by the powerful tools that make up the Office 365 product suite. While the rapid adoption of SaaS-based applications like Office 365 has been fueled by the unique advantages of the cloud, it is essential to note that no offering – cloud-based or otherwise – necessarily fulfills every requirement for every customer. For instance, even though Office 365 comes in a variety of packages with different capabilities and at a wide range of price points, decision-makers must remember that the offering is intended to serve certain needs of a large, enterprise audience.

The cloud apps data protection handbook

Cloud-based applications have become critical to businesses and operations around the globe. But do leading Software as a Service (SaaS) providers such as Box, Microsoft, Google, and Salesforce protect their customers’ data with equally critical backup options? And can they recover deleted data when needed, or is it just lost?

It is becoming standard practice to ask why you would want additional protection for data that’s already in the cloud. It turns out that cloud application providers may offer only limited levels of retention and recovery, which are primarily in place to ensure data accessibility and to protect themselves and their clients from only certain types of data loss.