Last Build Date: Mon, 24 Apr 2017 17:06:41 EDT
Copyright: Copyright 2017 Ulitzer.com
Mon, 24 Apr 2017 06:45:00 EDT The vision of moving from a visual model, to a declarative metadata representation, to an immutable deployment is in essence what software-defined (SD) approaches are all about. The secret to making this approach practical, and thus the key to understanding why SD approaches have become so prevalent, is the word immutable. Once we get an SD approach right, we no longer have to touch the deployed technology whatsoever. Instead, to make a change, we update the model and redeploy.
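The model-to-immutable-deployment idea above can be sketched in a few lines of toy code. This is only an illustration of the pattern, not any real SD tool's API; the `deploy` function and `Model` shape here are invented for the example.

```python
# Toy sketch of the declarative, immutable-deployment pattern: the model is
# plain data, the deployment is a read-only artifact derived from it, and
# "change" means updating the model and redeploying, never mutating in place.
import hashlib
import json
from types import MappingProxyType


def deploy(model: dict):
    """Render a declarative model into an immutable deployment artifact."""
    rendered = json.dumps(model, sort_keys=True)
    version = hashlib.sha256(rendered.encode()).hexdigest()[:12]
    # MappingProxyType makes the artifact read-only: you cannot patch a
    # deployment, only derive a new one from an updated model.
    return MappingProxyType({"version": version, "spec": rendered})


model = {"service": "web", "replicas": 2}
d1 = deploy(model)
model["replicas"] = 3   # change the model...
d2 = deploy(model)      # ...and redeploy; d1 is untouched
```

Because each artifact's version is derived from the model's content, any model change yields a new deployment rather than an edit to the old one.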
Sun, 23 Apr 2017 21:45:00 EDT While I am all for traditions like Thanksgiving turkey and Sunday afternoon football, holding onto traditions in your professional life can be career limiting. The awesome thing about careers in technology is that you constantly have to be on your front foot. Because when you're not, someone, somewhere, will be, and when you meet them, they'll win. One tradition with a limited lifespan at this moment is waterfall-native development and the security practices that go along with it. While the beginning of the end might have first been witnessed when Gene Kim and Josh Corman presented "Security Is Dead" at RSA in 2012, we have more quantifiable evidence from the 2017 DevSecOps Community Survey. When asked about the maturity of DevOps practices in their organizations, 40% stated that maturity was improving, while 25% said that it was very mature across the organization or in specific pockets.
Wed, 19 Apr 2017 16:00:00 EDT Recently, SmartBear completed an analysis to gauge what software professionals believe is the core value provided by API virtualization. It concluded that software professionals, including developers, testers, managers, and architects, believe the biggest benefit of virtualization is that it brings teams together by allowing them to collaborate. In total, 18% more respondents indicated that virtualization has more value in uniting teams than in adding speed to delivery or reducing costs.
Sun, 09 Apr 2017 16:51:00 EDT Officially, virtual data rooms (VDRs) are defined as digital databases in which companies can store and easily share data with third parties, usually during a business deal. In the past five years, the VDR industry has boomed, and by this year it is expected to reach $1.2 trillion, according to research firm IBIS.
Sat, 25 Mar 2017 17:00:00 EDT With the complexity of today's applications, it's easy to end up in a situation where all of the pieces of your code aren't ready at the same time. As a developer, you might be waiting for a third-party API to get updated, a partner organization to finish their code, or other teams in your organization to have a component ready to start testing against. This can be a drag on your organization's entire release schedule, as testing is backed up waiting for all the pieces to be finished.
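The standard way out of this wait, as later items in this feed on service virtualization discuss, is to stand in a stub for the unfinished dependency. A minimal sketch of the idea, using only the Python standard library; the `/status` endpoint and its payload are invented for illustration, not any real partner API:

```python
# Stand up a local stub that mimics an unfinished third-party API so the
# team's tests can proceed before the real service exists.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class StubAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # Canned response standing in for the dependency's real endpoint
        body = json.dumps({"status": "ok", "source": "stub"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), StubAPI)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
```

Tests then point at `http://127.0.0.1:{port}` instead of the real dependency, and the stub is swapped out once the real component ships.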
Fri, 24 Mar 2017 14:00:00 EDT This past week HPE continued buying into server storage I/O data infrastructure technologies, announcing an all-cash (i.e., no stock) acquisition of Nimble Storage (NMBL). The cash acquisition, for a little over $1B USD, amounts to $12.50 USD per Nimble share, double what it had traded at. As a refresher, or overview, Nimble is an all-flash shared storage system leveraging NAND flash solid-state device (SSD) performance. Note that Nimble also partners with Cisco and Lenovo platforms that compete with HPE servers for converged systems. Earlier this year (keep in mind it's only mid-March) HPE also announced the acquisition of server storage Hyper-Converged Infrastructure (HCI) vendor SimpliVity (about $650M USD cash). In another investment this year, HPE joined other investors in the latest funding round of scale-out, software-defined storage startup Hedvig (more on that later). These acquisitions are in addition to smaller ones, such as last year's purchase of SGI, not to mention various divestitures.
Fri, 17 Mar 2017 16:00:00 EDT Welcome to the Cloud, Big Data, Software Defined, Bulk and Object Storage fundamentals page, part of the objectstoragecenter.com micro site collection of resources. Software-defined, bulk, cloud, and object storage exist to support expanding and diverse application data demands. There are various types of cloud, bulk, and object storage, including public services such as Amazon Web Services (AWS) Simple Storage Service (S3), Google, Microsoft Azure, IBM Softlayer, and Rackspace, among many others. There are also solutions for hybrid and private deployment from Cisco, DDN, Dell EMC, Fujitsu, HDS, HPE, IBM, NetApp, Noobaa, OpenStack, Quantum, Rackspace, Scality, Seagate, Spectra, Storpool, Suse, Swift and WD among others.
Fri, 03 Feb 2017 16:00:00 EST Server Virtualization has transformed the way we manage server workloads, but virtualization hypervisors were not the endgame of datacenter management. What is the role of server virtualization and hypervisors in the new age of cloud, containers, and more importantly, hyperconvergence? I covered SAN technology in my last Infrastructure 101 article, so for today I'm going to cover server virtualization and maybe delve into containers and cloud. Server virtualization as we know it now is based on hypervisor technology. A hypervisor is an operating system that allows sharing of physical computing resources such as networking, CPU, RAM, and storage among multiple virtual machines (sometimes called virtual servers). Virtual machines replaced traditional physical servers that each had their own physical chassis with storage, RAM, networking, and CPU. To understand the importance of hypervisors, let's look at a bit of history.
Sun, 08 Jan 2017 12:00:00 EST Almost three years ago, VMware introduced the world to its virtual SAN. This new solution enabled customers to use storage within ESXi servers without the need for external storage – an exciting promise for organizations that wanted to quickly scale their virtual storage. Now, it's time to check in on this technology and see if it's living up to its promise. VMware became a player in the storage array and software market when it launched vSAN. Server admins were looking forward to using vSAN because it gave them a symmetrical architecture that did not require external storage, allowing them to use storage within existing servers. It also doesn't require specialized storage skills. However, no one solution can be all things to all enterprises, and as enterprises began to deploy vSAN across their environments, they noticed something big was missing.
Sat, 31 Dec 2016 17:45:00 EST Enterprise IT has been in the era of Hybrid Cloud for some time now. But it seems most conversations about Hybrid are focused on integrating AWS, Microsoft Azure, or Google ECM into existing on-premises systems. Where is all the Private Cloud? What do technology providers need to do to make their offerings more compelling? How should enterprise IT executives and buyers define their focus, needs, and roadmap, and communicate that clearly to the providers?
Sun, 18 Dec 2016 16:00:00 EST Information self-service is undoubtedly one of the main drivers of Modern Data Management. From "data services marketplaces" to "self-service Big Data analytics," one of the objectives of most data-related initiatives today is to provide business professionals with new ways to solve their information needs, with the goals of achieving self-reliance and minimizing the IT bottleneck. However, is it realistic to expect business users to assume this job? Studies report that more than 60 percent of companies grade their experience with self-service initiatives as "average" or lower, with nearly three out of four (73 percent) claiming that "…it requires more training than expected." So, what is the problem and what can we do to solve it? Let's start with the easy part: data visualization, which is the last stage of the data analysis process. Self-service BI tools have been around for some years now, allowing business data analysts to create their own graphical reports. Although those tools are not for every business user, business analysts with data experience, basic knowledge of statistics, and a bit of SQL can use them successfully.
Sat, 17 Dec 2016 11:00:00 EST Reality itself is going through a digital transformation, thanks to leaps in 3D rendering and the crunch speed of motion-feedback data. Although the modern definition of virtual reality (VR) has been making promises for three decades, the emphasis was always on the potential. Now it's here. This is a tour of the state of VR in 2016 and where developers are taking it as VR spreads far beyond the world of gaming.
Thu, 08 Dec 2016 13:00:00 EST Fact: storage performance problems have only gotten more complicated, as applications not only have become largely virtualized, but also have moved to cloud-based infrastructures. Storage performance in virtualized environments isn't just about IOPS anymore. Instead, you need to guarantee performance for individual VMs, helping applications maintain performance as the number of VMs continues to go up in real time. In his session at Cloud Expo, Dhiraj Sehgal, Product and Marketing at Tintri, shared success stories from a few folks who have already started using VM-aware storage. By managing storage operations at the VM-level, they've been able to solve their most vexing storage problems, and create infrastructures that scale to meet the needs of their applications. Best of all, they've got predictable, manageable storage performance – at a level conventional storage can't match.
Sat, 19 Nov 2016 14:00:00 EST Flash storage has become a mainstream technology, with 451 Research expecting the market to reach $9.6 billion by 2020. As the technology becomes less cost-prohibitive, and benefits such as its exponentially greater performance capabilities and simplified process for provisioning and optimizing systems become more sought after, it's clear that the future of storage is flash. But while some organizations may have taken advantage of the burgeoning technology's benefits early on, a significant number of companies have yet to make the transition. Your organization may very well fall into this category.
Thu, 17 Nov 2016 17:00:00 EST VMware configurations designed to provide high availability often make it difficult to achieve the satisfactory performance required by mission-critical SQL Server applications. But what if it were possible to have both high availability and high performance without the high cost and complexity normally required? This article explores two requirements for getting both for SQL applications, while reducing capital and operational expenditures. The first is to implement a storage architecture within VMware environments designed for both high availability and high performance; the second involves tuning that high-availability/high-performance (HA/HP) architecture for peak performance.
Mon, 31 Oct 2016 13:15:00 EDT What happens when the different parts of a vehicle become smarter than the vehicle itself? As we move toward the era of smart everything, hundreds of entities in a vehicle that communicate with each other, the vehicle, and external systems create a need for identity orchestration so that all entities work as a conglomerate. Much like an orchestra without a conductor, transportation and fleet services risk having connected yet disparate systems without the ability to secure, control, and connect the link between a vehicle's head unit, devices, and systems, and to manage the lifecycle of people, systems, and devices.
Sat, 17 Sep 2016 14:00:00 EDT When it comes to IT infrastructure, there are some big differences in the needs of the SMB vs. the enterprise. What might be minor hiccups in the enterprise can be major challenges in the SMB. What are these differences and how should they affect the way solutions are provided?
Fri, 05 Aug 2016 14:00:00 EDT This is part one of a two-part series of posts about using some common server storage I/O benchmark tools and workload scripts. View part II here, which includes the workload scripts and where to view sample results. There are various tools and workloads for server I/O benchmark testing, validating, and exercising different storage devices (or systems and appliances), such as Non-Volatile Memory (NVM) flash Solid State Devices (SSDs) or Hard Disk Drives (HDDs), among others.
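To give a feel for what such a workload script looks like at its simplest, here is a toy random-read generator against a scratch file. Real benchmarking uses purpose-built tools of the kind the series covers; the block size, file size, and run time below are arbitrary illustration values.

```python
# Toy server storage I/O workload: issue random 4 KiB reads against a
# scratch file for a fixed duration and report an IOPS figure.
import os
import random
import tempfile
import time

BLOCK = 4096                      # 4 KiB transfer size
FILE_SIZE = 4 * 1024 * 1024       # 4 MiB scratch file

# Create and fill the scratch file we will read back randomly.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(FILE_SIZE))
    path = f.name

ios = 0
start = time.perf_counter()
with open(path, "rb") as f:
    while time.perf_counter() - start < 0.5:   # run for half a second
        f.seek(random.randrange(0, FILE_SIZE - BLOCK))
        f.read(BLOCK)
        ios += 1
elapsed = time.perf_counter() - start

print(f"{ios} random {BLOCK}-byte reads, {ios / elapsed:.0f} IOPS")
os.remove(path)
```

Note the caveat that applies to any quick script like this: with a file this small, reads are served from the OS page cache, so the number reported measures the cache, not the device, which is exactly why dedicated tools exist.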
Sat, 07 May 2016 17:00:00 EDT Incumbent storage vendors such as EMC, NetApp, and Nutanix have built their rich code base on the block layer. This means that the only viable option for using Persistent Memory as a storage tier is to wrap it in an abstraction layer that presents it as a block device.
Fri, 22 Apr 2016 12:15:00 EDT For those involved in data management or data infrastructures, the following are five tips to help cut the overhead and resulting impact of digital e-waste and, later, physical e-waste. Most conversations involving e-waste focus on the physical aspects of disposing of electronics, along with the later impacts. While physical e-waste is an important topic, let's expand the conversation to include other variations of e-waste, including digital. By digital e-waste I'm referring to the use of physical items that end up contributing to traditional e-waste.
Sun, 27 Mar 2016 20:00:00 EDT Containers are rapidly rushing to the fore. They're the darling du jour of DevOps, and it's a rare conversation on microservices that doesn't invoke its BFF, containers. SDx Central's recent report on containers found that only 17% of respondents were not considering containers at all. That's comparable with the assertion in Kubernetes' State of the Container World, Jan 2016, that 71% of folks were actively using containers, though the Kubernetes survey found a much higher percentage of those who say they're running containers in production (50%) than SDx found (7%).
Sun, 13 Mar 2016 11:00:00 EDT The business dictionary defines efficiency as the comparison of what is actually produced or performed with what can be achieved with the same consumption of resources (money, time, labor, design, etc.). For example: the designers needed to revise the product specifications because the complexity of its parts reduced the efficiency of the product.
Wed, 03 Feb 2016 13:30:00 EST Need to test a server, storage I/O networking, hardware, software, services, cloud, virtual, physical, or other environment that is either doing some form of file processing, or that you simply want to have some extra workload running in the background for whatever reason? Here's a quick and relatively easy way to do it with Vdbench (free from Oracle). Granted, there are other tools, both free and for fee, that can do similar things; however, we will leave those for another day and post. Here's the con to this approach: there is no GUI like what you have available with some other tools. Here's the pro to this approach: it's free, flexible, and limited only by your creativity, amount of storage space, server memory, and I/O capacity.
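As a rough sketch of what driving Vdbench looks like, a file-processing workload is described in a small parameter file and passed to the `vdbench` script. The anchor path, file counts, and sizes below are placeholders to adapt to your environment; consult the Vdbench documentation for the full parameter set.

```
# hypothetical Vdbench parameter file: random 4k reads over a small file tree
fsd=fsd1,anchor=/tmp/vdbench,depth=2,width=4,files=100,size=1m
fwd=fwd1,fsd=fsd1,operation=read,xfersize=4k,fileio=random,threads=4
rd=rd1,fwd=fwd1,fwdrate=max,format=yes,elapsed=60,interval=5
```

Run it with something like `./vdbench -f parmfile.txt`; `format=yes` creates the file tree on the first pass, and the interval reports give you the background workload's ongoing rates.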
Tue, 19 Jan 2016 08:01:00 EST Actifio has announced the general availability of Actifio Global Manager (AGM), a web-scale data virtualization solution delivering instant access and radically simple management of application data for business resiliency and test data management across private, public, and hybrid cloud environments. Over the last six years, the Actifio copy data virtualization platform has been deployed in many of the world's largest and most complex enterprise IT organizations and Managed Service Providers (MSPs). It has scaled up to thousands of application instances associated with petabytes of data deployed across private data centers, and hybrid or public cloud environments including Amazon AWS. After an extensive early access program, Actifio has released AGM for general availability for these web-scale environments, delivering Actifio's trademark capabilities of instant application data access, even for very large database instances, all driven by Service Level Agreements (SLAs) extending across the full lifecycle of data from production to retirement.
Wed, 06 Jan 2016 10:30:00 EST From the SD Times March Madness Tournament, to the list of new research from voke, Forrester, and Gartner, to the most crowded sessions at key software testing conferences, service virtualization was a hot topic throughout 2015. Out of the 129 white papers, articles, videos, and case studies on Parasoft's service virtualization resource center, these 10 were the most popular in 2015.
Tue, 08 Dec 2015 08:01:00 EST SUSE® has joined the Open Platform for NFV (OPNFV) project, a carrier-grade, integrated open source platform that is accelerating the introduction of new products and services using network functions virtualization (NFV). The addition of NFV capabilities enhances SUSE's software-defined data center offerings, including OpenStack-based cloud infrastructure and Ceph-based software-defined storage. "SUSE is extending what we've been doing for years in the mission-critical compute, OpenStack cloud and enterprise storage spaces, bringing carrier-grade technology and service to the software-defined data center," said Nils Brauckmann, president and general manager of SUSE. "Our engagement with the OPNFV project as a platinum member will help accelerate the NFV platform for partners and customers alike."
Thu, 12 Nov 2015 17:00:00 EST EMC Corporation, the world leader in information management and storage, today announced that it has been positioned by Gartner, Inc. in the 'Leaders' quadrant in the 'Magic Quadrant for Enterprise Content Management, 2005'(1) report. Gartner Inc.'s Magic Quadrant positioned EMC as an enterprise content management (ECM) leader based on the completeness of its vision and ability to execute that vision. Gartner describes companies listed in the 'Leaders' quadrant as performing well today, having a clear vision of market direction and actively building competencies to sustain their leadership position in the market.
Fri, 30 Oct 2015 22:00:00 EDT Our guest on the podcast this week is Helen Beal, Head of DevOps at Ranger4 Limited. We discuss how successful DevOps transitions depend on culture, so to start, companies must identify their current problem areas. Helen describes the most successful DevOps culture as a place where each individual has autonomy as part of the larger team and where experimentation is encouraged.
Fri, 30 Oct 2015 10:00:00 EDT Connected things, systems and people can provide information to other things, systems and people and initiate actions for each other that result in new service possibilities. By taking a look at the impact of the Internet of Things when it transitions to a highly connected services marketplace, we can understand how connecting the right "things" and leveraging the right partners can provide enormous impact to your business' growth and success. In her general session at @ThingsExpo, Esmeralda Swartz, VP, Marketing Enterprise and Cloud at Ericsson, discussed how this exciting emergence of layers of service offerings across a growing partner ecosystem can be monetized for the benefit of smart digital citizens, enterprises and society.
Wed, 28 Oct 2015 10:00:00 EDT Overgrown applications have given way to modular applications, driven by the need to break larger problems into smaller problems. Similarly, large monolithic development processes have been forced to be broken into smaller agile development cycles. Looking at trends in software development, microservices architectures meet the same demands. Additional benefits of microservices architectures are compartmentalization and the limited impact of a service failure versus a complete software malfunction. The problem is that there are a lot of moving parts in these designs; this makes assuring performance complex, especially if the services are geographically distributed or provided by multiple third parties.
Thu, 15 Oct 2015 16:50:00 EDT Want hands-on experience with service virtualization—one of the most exciting new software testing technologies in years? Then don't miss this free Service Virtualization certification program led by Parasoft: the company that pioneered service virtualization in 2002. After spending 2 hours with top Software Evangelist Arthur "Code Curmudgeon" Hicken, you'll have a core understanding of how your team can use service virtualization to test earlier, faster, and more completely.
Sat, 10 Oct 2015 10:45:00 EDT Clutch is now a Docker Authorized Consulting Partner, having completed Docker's certification course on the "Docker Accelerator for CI Engagements." More info about Clutch's success implementing Docker can be found here. Docker is an open platform for developers and system administrators to build, ship and run distributed applications. With Docker, IT organizations shrink application delivery from months to minutes, frictionlessly move workloads between data centers and the cloud and achieve 20x greater efficiency in their use of computing resources. Inspired by an active community and transparent, open source innovation, Docker containers have been downloaded more than 800 million times. Docker also provides enterprise subscriptions that deliver the software, support and maintenance organizations need to deploy a Dockerized application environment.
Wed, 07 Oct 2015 10:00:00 EDT SYS-CON Events announced today that Interface Masters Technologies, provider of leading network visibility and monitoring solutions, will exhibit at the 17th International CloudExpo®, which will take place on November 3–5, 2015, at the Santa Clara Convention Center in Santa Clara, CA. Interface Masters Technologies is a leading provider of high speed networking solutions focused on Gigabit, 10 Gigabit, 40 Gigabit and 100 Gigabit Ethernet network access and connectivity products. For over 20 years, the company has been providing innovative networking solutions with customization services to OEMs, large enterprises and sophisticated end users. Interface Masters has been an OCP member contributing multiple white box designs to the project while supporting customer SDN development.
Sat, 19 Sep 2015 13:00:00 EDT Communications cooperative HTC centralizes storage management to gain powerful visibility, reduce costs, and implement IT disaster-avoidance capabilities. We'll learn more about how HTC lowers total storage utilization cost while bringing in a common management view to improve problem resolution, automate resource allocation, and more fully achieve compliance -- as well as set the stage for broader virtualization and business continuity benefits.
Thu, 03 Sep 2015 13:00:00 EDT IoT is the next development of how the Internet is applied to the world. TAM for M2M/IoT is estimated at $19 trillion. The IoT device count is in the billions, but those devices will not all traverse the service providers' networks. Service providers and vendors are struggling to understand how to map the TAM dollars to real use cases, optimal technology approaches, and profitable business models. In his session at @ThingsExpo, Dennis Ward, IoT analyst and strategist at DWE, will focus on the SP transformations that will occur: in Phase I, SP infrastructure virtualization; in Phase II, SPs will focus on monetization. The key is IoT cloud-based, service-centric use cases.
Sun, 30 Aug 2015 03:00:00 EDT Coolect, LLC, innovators of digital media management software, announced today the release of a new personal data management software product, Coolect, setting a new benchmark for the collecting, organizing, storing and displaying of digital media.
Sun, 30 Aug 2015 03:00:00 EDT FileMaker today announced the immediate availability of FileMaker Pro 8, the newest version of the most-awarded desktop database, featuring new ways to work faster, share and manage information of all types, and be more productive.
Thu, 27 Aug 2015 09:00:00 EDT When I started exploring virtualization, like many folks, I was in awe of how much efficiency came with moving physical servers into VMs. To this day, the number of success stories about improved usage, reduced overhead costs and increased functionality makes virtualization a solid business model for IT folks. Then I learned about containerization, and, well, it takes efficiency to another level.
Wed, 19 Aug 2015 15:00:00 EDT We really are moving in the direction of truly commoditized hardware. Some uses will always have specific requirements that are not mainstream and thus will require specialized builds; this is true in every industry. But increasingly, who made your hardware and where they got their parts from is a secondary issue. Which makes one consider what really sells hardware these days. Years ago, when I was working for Network Computing, I reviewed a low-end blade server company capable of cranking out blades at a fraction of the cost of most vendors. They (like far too many good companies) ran out of money before they could grab market traction, but they did show that it could be done at a price even small enterprises could afford.
Fri, 14 Aug 2015 11:00:00 EDT The amount of data processed in the world doubles every three years, and a global commitment to open source technology is the way to handle this growth. An open technology approach fosters innovation through massive community involvement and impedes expensive vendor lock-in. This benefits buyers as markets remain more competitive. In doing so, open standards and technologies also allow for market hypergrowth, and this is the key to handling the growth of data. A doubling every three years means we'll be grappling with a full Yottabyte of data in the year 2040. That's one billion petabytes, an amount of data that, similar to pondering geologic time, I can understand in the abstract but not truly grasp. Meanwhile, the nature of this data--which can truly be called Big Data in today's age of the Zettabyte--is transforming from a jet plane model to a chewing gum model. By this I mean Big Data in its original conception 20 years ago referred to a small number of massive files, the type found in meteorology and nuclear-bomb building. Tomorrow's Big Data will largely be a product of billions of sensors, transmitting less than 10K at a time. Rather than thinking about a few 747s, we'll be thinking about billions of pieces of chewing gum.

Already There: We're already in such a Little Big Data era, with stuff like Hadoop and NoSQL databases equipped to handle the onslaught of data volume, variety, and--because of much of this data's real-time nature--velocity. But we've only just begun. Innovation must continue apace, in hardware even more so than software. Today's technology already requires almost 3% of the world's electricity grid to power its data centers--exponential increases in data processing simply will not be met by the global electricity grid in the absence of vast new hardware efficiencies.

The OPG: Thus I'm involved with something called the Open Performance Grid, or OPG.
Announced in San Francisco in August 2015, the Open Performance Grid measures openness, performance, and leadership of hardware, software, and designs for modern data centers. The OPG is a community effort with input from technology users and buyers, analysts and researchers, and vendors who wish to compare their own self-assessments with what the community is saying. Sample measures of openness, beyond simple open-source availability, include the presence, size, and activity of a community and foundation for a particular technology. Market share, benchmark performance, and what we call the Innovation Curve are also part of the mix. Software categories include operating systems, virtualization, containers, PaaS, IaaS services and stacks, monitoring/analytics, management consoles, software-defined storage, SDN, SDDC. For hardware, we're looking at chips, boards, subsystems, and even overall data center designs. The challenge of meeting the astounding growth in data is enormous. The Open Performance Grid is a way to encourage and enable the technology provider, development, and user commu[...]
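The doubling arithmetic in the item above is easy to sanity-check: a thousandfold increase (a zettabyte to a yottabyte) at a doubling every three years takes about ten doublings, i.e., roughly thirty years. A two-line check:

```python
# How long does a 1000x increase take if data doubles every three years?
import math

doubling_period = 3        # years per doubling
growth_factor = 1000       # zettabyte -> yottabyte is 10^3, about 2^10
years = doubling_period * math.log2(growth_factor)
print(f"{years:.1f} years")
```

This is the same back-of-the-envelope reasoning behind the article's yottabyte projection.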
Wed, 05 Aug 2015 11:00:00 EDT Services providers have traditionally organized the management and operation of different technologies into several teams with very specific domain knowledge. These teams have been staffed with specialists looking after routing, network services, servers, virtualization, storage area networks, security and various other technology domains. Over time, these functional teams have had the tendency to develop into loosely tied silos.
Wed, 22 Jul 2015 14:45:00 EDT Mobile devices. Cloud-based services. The Internet of Things. What do all of these trends have in common? They are some of the factors driving the unprecedented growth of data today. And where data grows, so does the need for data storage. The traditional method of buying more hardware is cost-prohibitive at the scale needed. As a result, a new storage paradigm is required. Enterprises today need flexible, scalable storage approaches if they hope to keep up with rising data demands. Software-defined storage (SDS) offers the needed flexibility. In light of the varied storage and compute needs of organizations, two SDS options have arisen: hyperconverged and hyperscale. Each approach has its distinctive features and benefits, which are explored below.
Tue, 21 Jul 2015 11:00:00 EDT I've always had a fascination with the way information is acquired and processed. Reading back through the history of this site, you can see this tendency towards more fanciful thinking, e.g., GPGPU-assisted network analytics, future storage systems using Torrenza-style processing. What was once theory has made its way into the realm of praxis; one need look no further than ICML 2015, for example, to see the forays into DML that NVIDIA is making with their GPUs. And on the story goes. Having said all this, there are elements of data, of data networking, of data processing, which, to date, have NOT gleaned all the benefits of this type of acceleration. To that end, what I am going to attempt to posit today is an area where Neural Networking (or at least the benefits therein) can be usefully applied to an area interacted with every single nanosecond of every day: the network.
Tue, 02 Jun 2015 22:30:00 EDT SYS-CON Events announced today that the "First Containers & Microservices Conference" will take place June 9-11, 2015, at the Javits Center in New York City. The "Second Containers & Microservices Conference" will take place November 3-5, 2015, at Santa Clara Convention Center, Santa Clara, CA. Containers and microservices have become topics of intense interest throughout the cloud developer and enterprise IT communities.
Thu, 14 May 2015 04:45:00 EDT SYS-CON Events announced today that WSM International (WSM), the world's leading cloud and server migration services provider, will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. WSM is a solutions integrator with a core focus on cloud and server migration, transformation and DevOps services.
Sun, 10 May 2015 15:30:00 EDT The Citrix X1 Mouse dramatically improves the user experience of any remote Windows app or desktop delivered to an iPad via Citrix and makes anyone more productive. At Citrix, we've been helping people access and use business apps on any device for years. Yet many of our customers depend on Windows-based applications that are hard to use on iPad and Android tablets, because so many features depend on the point-and-click simplicity and accuracy of a physical mouse.
Tue, 05 May 2015 23:00:00 EDT We heard for many years how developing nations would be able to develop mobile-phone networks quickly, perhaps even leapfrog developed nations, because their lack of traditional, wired networks would not inhibit them from deploying the new technology. Now there is talk of history repeating itself with the Industrial Internet--a key aspect of the emerging Internet of Things. For example, Guo Ping, Deputy Chairman of the Board of Chinese electronics giant Huawei, said in a recent report from the World Economic Forum, "The Industrial Internet will afford emerging markets a unique opportunity to leapfrog developed countries in digital infrastructure." To some degree the first prediction turned out to be true, as mobile communications have become well established in many developing countries, and mobile phones are the first phones ever used by perhaps 2 billion people. Our ongoing research at the Tau Institute shows that, indeed, developing nations in several regions are the most dynamic among all nations of the world. Unleashing Potential: Now, with the Industrial Internet, no less a pontificator than Salesforce CEO Marc Benioff pronounced the IoT "ground zero for a new phase of global transformation...reshaping industries," in the same WEF report. This particular report, entitled "Industrial Internet of Things: Unleashing the Potential of Connected Products and Services," cites operational efficiency, connected ecosystems, software platforms, collaboration between humans and machines, and something called the outcome economy as the key opportunities afforded by the Industrial Internet. ("Outcome economy" is some mumbo-jumbo invented by the report's collaborator, Accenture, and seems to mean that feedback from the IoT will provide companies with new insights that let them create products and services that will better meet customers' outcomes.
Perhaps pharmaceutical companies in the past, for example, were unclear that their customers wanted to feel better.) In any case, the touted new efficiencies of the IoT in general and the Industrial(ized) Internet in particular do seem to hold promise to bring new productivity--and if history is a guide, economic growth--to nations that move toward the IoT aggressively. Healthy Growth Economic growth without increased economic parity and social development will be the empty calories of this new global development engine: if bigger just means fatter, then nations will be hurting themselves over the long term. This is one of our concerns about recent economic growth in the Philippines, for example. It’s widely reported that the administration of President Noynoy Aquino--which runs from 2010-2016 in the country’s single-term presidential system--has produced rapid economic growt[...]
Sat, 02 May 2015 21:00:00 EDTThere are more mobile phones in the Philippines than people. And there are a lot of people. This is one of the amazing statistics of our current era, in which the compulsion for humans to communicate is leading us into the realm of massive data flows in an increasingly interconnected world. The Philippine phenomenon is due in part to the presence of two dominant mobile carriers--and a third that nips strongly at their heels--which charge extra fees for texts and calls outside their networks. About 96% of this traffic is prepaid, with users “loading” up their phones at ubiquitous small stores in increments of less than one US dollar. Similar noteworthy statistics are found elsewhere in Southeast Asia. Indonesia, for example, was sending out 385 tweets per second in 2013, grabbing 7.5% of global Twitter traffic. Other strong social-media numbers have led many there to refer to their country as “the social media capital of the world.” With overall wired Internet access still lacking, more than 60% of Indonesia's social traffic is mobile. Thailand claims 97% of its population on social media, with a prepaid SIM-card system like the Philippines' and easy roaming throughout neighboring countries. Malaysia has a higher average income than most of its neighbors, and Vietnam a lower one, but both also contribute to an Asian average of more than 360MB of mobile data use per month. Singapore is of course the great economic power of the region, with a per-person income level that now surpasses that of the United States. Singapore has among the fastest Internet access in the world and is moving toward becoming a Smart City through use of IoT technology. The average mobile data use in Asia is more than three times that of North America, almost 20 times that of Europe, and 200 times that of Africa. As I noted above, amazing. 
Heat, Noise, and Startups The hyperkinetic nature of Southeast Asian nations—the traffic, noise, masses of people, and heat can easily overwhelm on a short-term basis and grind one down over the long term—is reflected in recent economic growth throughout the region. Our research at the Tau Institute shows the region to be the most dynamic in the world, even as clear infrastructure problems are apparent everywhere outside of Singapore. This energy is also reflected in a growing culture of startups and innovation. I recently attended a startup competition in Manila that is part of the larger Top 100 program, which culminates in Singapore June 23-24, where 100 companies from a field of 300 across 14 Asian nations will compete for attention from investors. The competition goes beyond Southeast Asia, with teams from India, Bangladesh, Kazakhstan, Taiwan, Japan, and South Kore[...]
Fri, 10 Apr 2015 16:36:47 EDTIn the electric utility business, structural change is afoot. If you are watching this all unfold in the regulatory hearing room, legislative chamber or the trade press, you are familiar with the players in this drama. You have the "technarians at the gate"--SolarCity and its cousin Tesla Motors, along with numerous other residential solar suppliers and advocates calling for the shift to a distributed, renewables-driven grid. On the other side is the establishment: the investor-owned utilities who own and operate the electric grid that serves the masses. The technarians seem to be the more modern, with-the-times contingent, while the utilities are viewed as stodgy, unwilling to change, and obstructionist.
Thu, 09 Apr 2015 14:45:00 EDTWhat if you could enjoy solid-state drive speeds without upgrading your storage area networks? What if there were a solution that didn't require a disruptive overhaul of your data center? That's the proposition being put forward by Cirrus Data with its newly launched Data Caching Server (DCS) solution. “Our new caching server and service represents a revolutionary way to address the need for speed for database administrators and a unique offering for service providers, as the appliances can be installed and removed in a live production environment,” says Wayne Lam, CEO of Cirrus Data.
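The core idea behind a caching server like this is to keep hot blocks on fast media in front of slower SAN storage, serving repeat reads without touching the back end. As a rough illustration only--the DCS itself operates transparently on the storage data path, and all names below are invented for the sketch--here is a minimal LRU block cache in Python:

```python
from collections import OrderedDict

class BlockCache:
    """Toy LRU read cache in front of a slow block store (illustrative only)."""

    def __init__(self, backing_store, capacity=1024):
        self.backing = backing_store   # dict-like: block number -> bytes
        self.capacity = capacity       # max number of cached blocks
        self.cache = OrderedDict()     # insertion order tracks recency
        self.hits = 0
        self.misses = 0

    def read(self, block_no):
        if block_no in self.cache:
            self.cache.move_to_end(block_no)   # mark as most recently used
            self.hits += 1
            return self.cache[block_no]
        self.misses += 1
        data = self.backing[block_no]          # slow path: go to the SAN
        self.cache[block_no] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)     # evict least recently used
        return data

# Simulate a small store and a workload with repeated hot blocks.
store = {n: bytes([n % 256]) * 4 for n in range(10)}
cache = BlockCache(store, capacity=4)
for n in [0, 1, 2, 0, 1, 3, 4, 0]:
    cache.read(n)
print(cache.hits, cache.misses)  # → 3 5
```

The appeal of the appliance model in the quote is that this layer sits between hosts and storage, so the database and the SAN need no reconfiguration when it is inserted or removed.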