Tue, 24 May 2016 22:11:25 GMT
I’ll be in the UK next week presenting at the free AzureCraft event being held on June 3rd and 4th. This event was created by the UK Azure User Group and is a great way to learn about Azure as well as engage with the Azure community in the UK.
I’ll be speaking on June 3rd from 9:30-11:30am on “What’s new in Azure”. It is going to have a lot of new content and highlight some of the cool new services and capabilities in Azure that developers might not yet have had a chance to try out (while at the same time being understandable even to people who have never used Azure before). Some of the topics + demos I’m planning to cover include:
After my talk there are a bunch of additional talks throughout the day that will then go into even more depth on different Azure topics.
You can register for the event for free here. The event on June 3rd (where I’m speaking) is being held at The Mermaid, Puddle Dock, Blackfriars, London. On June 4th there is then a great set of additional talks + workshops being held at the Microsoft Reading campus.
You can learn more about the overall event on the Azurecraft.uk website.
Hope to see you there!
Wed, 24 Feb 2016 18:56:58 GMT
As the role of mobile devices in people's lives expands even further, mobile app developers have become a driving force for software innovation. At Microsoft, we are working to enable even greater developer innovation by providing the best experiences to all developers, on any device, with powerful tools, an open platform and a global cloud.
As part of this commitment I am pleased to announce today that Microsoft has signed an agreement to acquire Xamarin, a leading platform provider for mobile app development.
In conjunction with Visual Studio, Xamarin provides a rich mobile development offering that enables developers to build mobile apps using C# and deliver fully native mobile app experiences to all major devices – including iOS, Android, and Windows. Xamarin’s approach enables developers to take advantage of the productivity and power of .NET to build mobile apps, and to use C# to write to the full set of native APIs and mobile capabilities provided by each device platform. This enables developers to easily share common app code across their iOS, Android and Windows apps while still delivering fully native experiences for each of the platforms. Xamarin’s unique solution has fueled amazing growth for more than four years.
Xamarin has more than 15,000 customers in 120 countries, including more than one hundred Fortune 500 companies - and more than 1.3 million unique developers have taken advantage of their offering. Top enterprises such as Alaska Airlines, Coca-Cola Bottling, Thermo Fisher, Honeywell and JetBlue use Xamarin, as do gaming companies like SuperGiant Games and Gummy Drop. Through Xamarin Test Cloud, all types of mobile developers—C#, Objective-C, Java and hybrid app builders —can also test and improve the quality of apps using thousands of cloud-hosted phones and devices. Xamarin was recently named one of the top startups that help run the Internet.
Microsoft has had a longstanding partnership with Xamarin, and we have jointly built Xamarin integration into Visual Studio, Microsoft Azure, Office 365 and our Enterprise Mobility Suite to provide developers with an end-to-end workflow for native, secure apps across platforms. We have also worked closely together to offer the training, tools, services and workflows developers need to succeed.
With today’s acquisition announcement we will be taking this work much further to make our world class developer tools and services even better with deeper integration and enable seamless mobile app dev experiences. The combination of Xamarin, Visual Studio, Visual Studio Team Services, and Azure delivers a complete mobile app dev solution that provides everything a developer needs to develop, test, deliver and instrument mobile apps for every device. We are really excited to see what you build with it.
We are looking forward to providing more information about our plans in the near future – starting at the Microsoft //Build conference coming up in a few weeks, followed by Xamarin Evolve in late April. Be sure to watch my Build keynote and get a front row seat at Evolve to learn more!
Thu, 01 Oct 2015 05:43:17 GMT
Yesterday we held our AzureCon event and were fortunate to have tens of thousands of developers around the world participate. During the event we announced several great new enhancements to Microsoft Azure, including:

- General Availability of 3 new Azure regions in India
- A new N-series of Virtual Machines with GPU capabilities
- Azure IoT Suite available to purchase
- Azure Container Service
- Azure Security Center

We were also fortunate to be joined on stage by several great Azure customers who talked about their experiences using Azure, including Jet.com, Nascar, Alaska Airlines, Walmart, and ThyssenKrupp.

Watching the Videos

All of the talks presented at AzureCon (including the 60 breakout talks) are now available to watch online. You can browse and watch all of the sessions here. My keynote to kick off the event was an hour long and provided an end-to-end look at Azure and some of the big new announcements of the day. You can watch it here. Below are more details on some of the highlights:

Announcing General Availability of 3 new Azure regions in India

Yesterday we announced the general availability of our new India regions: Mumbai (West), Chennai (South) and Pune (Central). They are now available for you to deploy solutions into. This brings our worldwide presence of Azure regions up to 24 regions, more than AWS and Google combined. Over 125 customers and partners have been participating in the private preview of our new India regions. We are seeing tremendous interest from industry sectors like Public Sector, Banking, Financial Services, Insurance and Healthcare, whose cloud adoption has been restricted by data residency requirements. You can all now deploy your solutions too.

Announcing N-series of Virtual Machines with GPU Support

This week we announced our new N-series family of Azure Virtual Machines that enable GPU capabilities.
Featuring NVIDIA’s best-of-breed Tesla GPUs, these Virtual Machines will help you run a variety of workloads ranging from remote visualization to machine learning to analytics. The N-series VMs feature NVIDIA’s flagship GPU, the K80, which is well supported by NVIDIA’s CUDA development community. The N-series will also have VM configurations featuring the latest M60, which was recently announced by NVIDIA. With support for the M60, Azure becomes the first hyperscale cloud provider to bring the capabilities of NVIDIA’s Quadro high-end graphics support to the cloud. In addition, the N-series combines GPU capabilities with a superfast RDMA interconnect, so you can run multi-machine, multi-GPU workloads such as Deep Learning and Skype Translator training.

Announcing Azure Security Center

This week we announced Azure Security Center, a new Azure service that gives you visibility and control over the security of your Azure resources, and helps you stay ahead of threats and attacks. Azure is the first cloud platform to provide unified security management with capabilities that help you prevent, detect, and respond to threats.

Azure Security Center provides a unified view of your security state, so your team and your organization’s security specialists can get the information they need to evaluate risk across the workloads they run in the cloud. Based on customizable policy, the service can provide recommendations. For example, the policy might be that all web applications should be protected by a web application firewall. If so, Azure Security Center will automatically detect when web apps you host in Azure don’t have a web application firewall configured, and provide a quick and direct workflow to get a firewall from one of our partners deployed and configured.

Of course, even with the best possible protection in place, attackers will still try to compromise systems.
To address this problem and adopt an “assume breach” mindset, the Azure Security Center uses advanced analytics, including machine learning, along[...]
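The policy-driven recommendation described above (flagging web apps that lack a web application firewall) can be sketched as a simple compliance check. This is purely illustrative Python, not the actual Security Center API; the resource dictionaries and field names below are invented for the example.

```python
# Illustrative sketch of a "web apps must have a WAF" policy check.
# The resource shapes and field names are invented for this example;
# the real Azure Security Center evaluates policies server-side.

def find_unprotected_web_apps(resources):
    """Return the names of web apps that have no web application firewall."""
    return [
        r["name"]
        for r in resources
        if r["type"] == "web_app" and not r.get("has_waf", False)
    ]

resources = [
    {"name": "storefront", "type": "web_app", "has_waf": True},
    {"name": "admin-portal", "type": "web_app"},   # no WAF configured
    {"name": "batch-worker", "type": "vm"},        # not a web app, ignored
]

print(find_unprotected_web_apps(resources))  # ['admin-portal']
```

In the real service, a finding like this would surface as a recommendation with a remediation workflow rather than a list of names.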
Mon, 28 Sep 2015 20:54:24 GMT
Today, I’m happy to announce several key additions to our big data services in Azure, including the General Availability of HDInsight on Linux, as well as the introduction of our new Azure Data Lake and language services.

General Availability of HDInsight on Linux

Today we are announcing general availability of our HDInsight service on Ubuntu Linux. HDInsight enables you to easily run managed Hadoop clusters in the cloud. With today’s release we now allow you to configure these clusters to run using either a Windows Server operating system or an Ubuntu-based Linux operating system.

HDInsight on Linux enables even broader support for Hadoop ecosystem partners to run in HDInsight, providing you even greater choice of preferred tools and applications for running Hadoop workloads. Both Linux and Windows clusters in HDInsight are built on the same standard Hadoop distribution and offer the same set of rich capabilities. Today’s release also enables additional capabilities, such as cluster scaling, virtual network integration and script action support. Furthermore, in addition to the Hadoop cluster type, you can now create HBase and Storm clusters on Linux for your NoSQL and real-time processing needs, such as building an IoT application.

Create a cluster

HDInsight clusters running Linux can now be easily created from the Azure Management Portal under the Data + Analytics section. Simply select Ubuntu from the cluster operating system drop-down, and optionally choose the cluster type you wish to create (we support base Hadoop as well as clusters pre-configured for workloads like Storm, Spark, HBase, etc).

All HDInsight Linux clusters can be managed by Apache Ambari. Ambari provides the ability to customize configuration settings of your Hadoop cluster while giving you a unified view of the performance and state of your cluster, with monitoring and alerting within the HDInsight cluster.
Installing additional applications and Hadoop components

Similar to HDInsight Windows clusters, you can now customize your Linux cluster by installing additional applications or Hadoop components that are not part of the default HDInsight deployment. This can be accomplished using Bash scripts with the script action capability. As an example, you can now install Hue on an HDInsight Linux cluster and easily use it with your workloads.

Develop using Familiar Tools

All HDInsight Linux clusters come with SSH connectivity enabled by default. You can connect to the cluster via the SSH client of your choice. Moreover, SSH tunneling can be leveraged to remotely access all of the Hadoop web applications from the browser.

New Azure Data Lake Services and Language

We continue to see customers enabling amazing scenarios with big data in Azure, including analyzing social graphs to increase charitable giving, analyzing radiation exposure, and using the signals from thousands of devices to simulate ways for utility customers to optimize their monthly bills. These and other use cases are resulting in even more data being collected in Azure. In order to dive deep into all of this data and process it in different ways, you can now use our Azure Data Lake capabilities: three services that make big data easy.

The first service in the family is available today: Azure HDInsight, our managed Hadoop service that lets you focus on finding insights rather than spending your time managing clusters. HDInsight lets you deploy Hadoop, Spark, Storm and HBase clusters, running on Linux or Windows, managed, monitored and supported by Microsoft with a 99.9% SLA. The other two services, Azure Data Lake Store and Azure Data Lake Analytics, introduced below, are available in private preview today and will be available broadly for public usage shortly.
Azure Data Lake Store Azure Data Lake Store is a hyper-scale HDFS repository designed specifically for big data analytics workloads in the cloud. Azure Data Lake Store solves th[...]
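As a footnote to the “Develop using Familiar Tools” section above: the SSH tunneling it mentions is typically a dynamic (SOCKS) forward, after which you point your browser’s proxy at the local port. The sketch below just builds the command; the `<cluster>-ssh.azurehdinsight.net` host format is my assumption based on the documentation of the era, so verify your cluster’s actual SSH endpoint in the portal.

```python
# Sketch: build an ssh command that opens a SOCKS tunnel to an HDInsight
# Linux cluster so the Hadoop web UIs (Ambari, etc.) can be browsed locally.
# The "<cluster>-ssh.azurehdinsight.net" hostname is an assumption here;
# check the SSH endpoint shown for your cluster before using it.

def build_ssh_tunnel_command(cluster_name, ssh_user, local_port=9876):
    host = f"{cluster_name}-ssh.azurehdinsight.net"
    return [
        "ssh",
        "-C",                   # compress tunneled traffic
        "-N",                   # no remote command; tunnel only
        "-D", str(local_port),  # dynamic (SOCKS) port forwarding
        f"{ssh_user}@{host}",
    ]

cmd = build_ssh_tunnel_command("mycluster", "sshuser")
print(" ".join(cmd))
```

With the tunnel running, configuring the browser to use `localhost:9876` as a SOCKS proxy routes requests to the cluster’s internal web endpoints.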
Mon, 28 Sep 2015 03:35:06 GMT
This Tuesday, Sept 29th, we are hosting our online AzureCon event: a free online event with 60 technical sessions on Azure, presented by the Azure engineering team as well as by MVPs and customers who use Azure today and will share their best practices.
I’ll be kicking off the event with a keynote at 9am PDT. Watch it to learn the latest on Azure, and hear about a lot of exciting new announcements. We’ll then have some fantastic sessions that you can watch throughout the day to learn even more.
Hope to see you there!
Wed, 23 Sep 2015 20:41:03 GMT
A few weeks ago, we announced the preview availability of the new Basic and Premium Elastic Database Pool tiers in our Azure SQL Database service. Elastic Database Pools enable you to run multiple, isolated and independent databases that can be scaled automatically across a private pool of resources dedicated to just you and your apps. This provides a great way for software-as-a-service (SaaS) developers to better isolate their individual customers in an economical way.
Today, we are announcing some nice changes to the pricing structure of Elastic Database Pools as well as changes to the density of elastic databases within a pool. These changes make it even more attractive to use Elastic Database Pools to build your applications.
Specifically, we are making the following changes:
Below are the updated parameters for each of the Elastic Database Pool options with these new changes:
For more information about Azure SQL Database Elastic Database Pools and management tools, go to the technical overview here.
Hope this helps,
Wed, 02 Sep 2015 17:51:22 GMT
Today, we’re announcing the release of the new Azure GS-series of Virtual Machine sizes, which enable Azure Premium Storage to be used with Azure G-series VM sizes. These VM sizes are now available to use in both our US and Europe regions.

Earlier this year we released the G-series of Azure Virtual Machines, which provide the largest VM sizes offered by any public cloud provider: up to 32 cores of CPU, 448 GB of memory and 6.59 TB of local SSD-based storage. Today’s release of the GS-series of Azure Virtual Machines enables you to use these large VMs with Azure Premium Storage, and enables you to perform up to 2,000 MB/sec of storage throughput, more than double any other public cloud provider. The G5/GS5 VM size now also offers more than 20 Gbps of network bandwidth, also more than double the network throughput provided by any other public cloud provider.

These new VM offerings provide an ideal solution for your most demanding cloud-based workloads, and are great for relational databases like SQL Server, MySQL, Postgres and other large data warehouse solutions. You can also use the GS-series to significantly scale up the performance of enterprise applications like Dynamics AX.

The G and GS-series of VM sizes are available to use now in our West US, East US 2, and West Europe Azure regions. You’ll see us continue to expand availability to more regions around the world in the coming months.

GS Series Size Details

The table below provides more details on the exact capabilities of the new GS-series of VM sizes:

Size           Cores  Memory (GB)  Max Disk IOPS  Max Disk Bandwidth (MB/sec)
Standard_GS1       2           28          5,000                          125
Standard_GS2       4           56         10,000                          250
Standard_GS3       8          112         20,000                          500
Standard_GS4      16          224         40,000                        1,000
Standard_GS5      32          448         80,000                        2,000

Creating a GS-Series Virtual Machine

Creating a new GS-series VM is very easy.
Simply navigate to the Azure Preview Portal, select New (+), and choose your favorite OS or VM image type. Click the Create button, then click the pricing tier option and select “View All” to see the full list of VM sizes. Make sure your region is West US, East US 2, or West Europe in order to select the G-series or the GS-series.

When choosing a GS-series VM size, the portal will create a storage account using Premium Azure Storage. You can also select an existing Premium Storage account to use for the OS disk of the VM. Hitting Create will launch and provision the VM.

Learn More

If you would like more information on the GS-series VM sizes as well as other Azure VM sizes, please visit the following page for additional details: Virtual Machine Sizes for Azure. For more information on Premium Storage, please see: Premium Storage overview. Also, refer to Using Linux VMs with Premium Storage for more details on Linux deployments on Premium Storage.

Hope this helps, Scott[...]
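As an aside, the GS-series size table above is easy to turn into a small lookup helper, for example to pick the smallest size that meets a workload’s disk requirements. This is purely illustrative; the numbers are copied from the table in this post.

```python
# Pick the smallest GS-series size meeting given disk IOPS / bandwidth needs.
# Figures are taken from the GS-series table in this post.

GS_SIZES = [
    # (name, cores, memory_gb, max_disk_iops, max_disk_mbps)
    ("Standard_GS1", 2, 28, 5000, 125),
    ("Standard_GS2", 4, 56, 10000, 250),
    ("Standard_GS3", 8, 112, 20000, 500),
    ("Standard_GS4", 16, 224, 40000, 1000),
    ("Standard_GS5", 32, 448, 80000, 2000),
]

def smallest_gs_for(iops_needed, mbps_needed):
    # Sizes are listed smallest-first, so the first match is the cheapest fit.
    for name, cores, mem, iops, mbps in GS_SIZES:
        if iops >= iops_needed and mbps >= mbps_needed:
            return name
    return None  # workload exceeds the largest GS size

print(smallest_gs_for(15000, 300))  # Standard_GS3
```

A workload needing 15,000 IOPS and 300 MB/sec lands on Standard_GS3, since GS2 tops out at 10,000 IOPS.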
Thu, 27 Aug 2015 16:13:09 GMT
Today we are making available several new SQL Database capabilities in Azure that enable you to build even better cloud applications. In particular:

- We are introducing two new pricing tiers for our Elastic Database Pool capability. Elastic Database Pools enable you to run multiple, isolated and independent databases on a private pool of resources dedicated to just you and your apps. This provides a great way for software-as-a-service (SaaS) developers to better isolate their individual customers in an economical way.
- We are also introducing new higher-end scale options for SQL Databases that enable you to run even larger databases with significantly more compute, storage and networking resources.

Both of these additions are available to start using immediately.

Elastic Database Pools

If you are a SaaS developer with tens, hundreds, or even thousands of databases, an elastic database pool dramatically simplifies the process of creating, maintaining, and managing performance across these databases within a budget that you control. A common SaaS application pattern (especially for B2B SaaS apps) is to use a different database to store data for each customer. This has the benefit of isolating the data for each customer (and enables each customer’s data to be encrypted, backed up, etc. separately). While this pattern is great from an isolation and security perspective, each database can end up having varying and unpredictable resource consumption (CPU/IO/memory), and because the peaks and valleys for each customer can be difficult to predict, it is hard to know how many resources to provision. Developers were previously faced with two options: either over-provision database resources based on peak usage and overpay, or under-provision to save cost, at the expense of performance and customer satisfaction during peaks.
Microsoft created elastic database pools specifically to help developers solve this problem. With Elastic Database Pools you can allocate a shared pool of database resources (CPU/IO/memory), and then create and run multiple isolated databases on top of this pool. You can set minimum and maximum performance SLA limits of your choosing for each database you add to the pool (ensuring that none of the databases unfairly impacts the other databases in your pool). Our management APIs also make it much easier to script and manage these multiple databases together, as well as optionally execute queries that span across them (useful for a variety of operations).

Best of all, when you add multiple databases to an Elastic Database Pool, you are able to average out the typical utilization load (because your customers tend to have different peaks and valleys) and end up requiring far fewer database resources (and spending less money as a result) than you would if you ran each database separately. The chart below shows a typical example of what we see when SaaS developers take advantage of the Elastic Pool capability. Each individual database has different peaks and valleys in its utilization. As you combine multiple of these databases into an Elastic Pool, the peaks and valleys tend to normalize out (since they often happen at different times), requiring much less overall resource than you would need if each database was resourced separately.

Because Elastic Database Pools are built on our SQL Database service, you also get to take advantage of all of the underlying database-as-a-service capabilities that are built into it: 99.99% SLA, multiple high-availability replicas built in at no extra charge, no downtime during patching, geo-replication, point-in-time recovery, TDE encryption of data, row-level security, full-text search, and much more. The end re[...]
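The normalization effect described above is easy to see numerically: when peaks happen at different times, the peak of the summed load is far below the sum of the individual peaks. Here is a small sketch with hourly utilization numbers that are entirely invented for the example (in arbitrary DTU-like units):

```python
# Illustrative: databases whose peaks occur at different times need far
# fewer pooled resources than the sum of their individually provisioned
# peaks. The hourly utilization numbers below are invented for the example.

usage = {
    "customer_a": [10, 80, 15, 10, 12, 10],  # peaks in hour 1
    "customer_b": [12, 15, 75, 14, 10, 11],  # peaks in hour 2
    "customer_c": [11, 10, 14, 85, 12, 10],  # peaks in hour 3
}

# Capacity needed if each database is sized for its own peak:
sum_of_peaks = sum(max(hours) for hours in usage.values())

# Capacity needed when the databases share one pool (peak of the sum):
pooled_peak = max(sum(h) for h in zip(*usage.values()))

print(sum_of_peaks)  # 240
print(pooled_peak)   # 109
```

In this made-up example the pool needs less than half the capacity of three individually provisioned databases, which is exactly the economics the chart in the post illustrates.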
Wed, 19 Aug 2015 16:01:46 GMT
At DockerCon this year, Mark Russinovich, CTO of Microsoft Azure, demonstrated the first ever application built using code running in both a Windows Server Container and a Linux container connected together. This demo helped demonstrate Microsoft's vision that, in partnership with Docker, we can help bring the Windows and Linux ecosystems together by enabling developers to build container-based distributed applications using the tools and platforms of their choice.

Today we are excited to release the first preview of Windows Server Containers as part of our Windows Server 2016 Technical Preview 3 release. We’re also announcing great updates from our close collaboration with Docker, including enabling support for the Windows platform in the Docker Engine and a preview of the Docker Engine for Windows. Our Visual Studio Tools for Docker, which we previewed earlier this year, have also been updated to support Windows Server Containers, providing you a seamless end-to-end experience straight from Visual Studio to develop and deploy code to both Windows Server and Linux containers. Last but not least, we’ve made it easy to get started with Windows Server Containers in Azure via a dedicated virtual machine image.

Windows Server Containers

Windows Server Containers create a highly agile Windows Server environment, enabling you to accelerate the DevOps process to efficiently build and deploy modern applications. With today’s preview release, millions of Windows developers will be able to experience the benefits of containers for the first time using the language of their choice, whether .NET, ASP.NET, PowerShell, Python, Ruby on Rails, Java or many others.

Today’s announcement delivers on the promise we made in partnership with Docker, the fast-growing open platform for distributed applications, to offer container and DevOps benefits to Linux and Windows Server users alike.
Windows Server Containers are now part of the Docker open source project, and Microsoft is a founding member of the Open Container Initiative. Windows Server Containers can be deployed and managed using either the Docker client or PowerShell.

Getting Started using Visual Studio

The preview of our Visual Studio Tools for Docker, which enables developers to build and publish ASP.NET 5 web apps or console applications directly to a Docker container, has been updated to include support for today’s preview of Windows Server Containers. The extension automates creating and configuring your container host in Azure, building a container image which includes your application, and publishing it directly to your container host. You can download and install this extension, and read more about it, at the Visual Studio Gallery here: http://aka.ms/vslovesdocker.

Once installed, developers can right-click on their projects within Visual Studio and select “Publish”. Doing so will display a Publish dialog, which now includes the ability to deploy to a Docker container (on either a Windows Server or Linux machine). You can choose to deploy to any existing Docker host you already have running, or use the dialog to create a new Virtual Machine running either Windows Server or Linux with containers enabled. The screenshot below shows how easy it is to create a new VM hosted on Azure that runs today’s Windows Server 2016 TP3 preview with container support; you can do all of this (and deploy your apps to it) without ever having to leave the Visual Studio IDE.

Getting Started Using Azure

In June of last year, at the first DockerCon, we enabled a streamlined Azure experience for creating and managing Docker hosts in the cloud. Up until now these hosts have only run on Linux. With the new preview of Windows Server 2016 supporting Windows Server Containers, we have enabled a parallel experience for Windows users. Dir[...]
Thu, 25 Jun 2015 05:59:52 GMT
Organizations moving to the cloud can achieve significant cost savings. But to achieve the maximum benefit you need to be able to accurately track your cloud spend in order to monitor and predict your costs. Enterprises need to be able to get detailed, granular consumption data and derive insights to effectively manage their cloud consumption.

I’m excited to announce the public preview release of two new Azure Billing APIs today: the Azure Usage API and the Azure RateCard API, which provide customers and partners programmatic access to their Azure consumption and pricing details:

- Azure Usage API: a REST API that customers and partners can use to get their usage data for an Azure subscription. As part of this new Billing API we now correlate the usage/costs with the resource tags you can set on your Azure resources (for example, you could assign a tag “Department abc” or “Project X” to a VM or database in order to better track spend on a resource and charge it back to an internal group within your company). To get more details, please read the MSDN page on the Usage API. Enterprise Agreement (EA) customers can also use this API to get a more granular view into their consumption data, and to complement what they get from the EA Billing CSV.
- Azure RateCard API: a REST API that customers and partners can use to get the list of the available resources they can use, along with metadata and price information about them. To get more details, please read the MSDN page on the RateCard API.

You can start taking advantage of both of these APIs today. You can write your own custom code that uses the APIs to construct your own custom reports, or you can take advantage of pre-built bill tracking systems provided by our partners, which already integrate the APIs into their existing solutions.
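The tag-based charge-back scenario described above boils down to grouping usage charges by a tag value. The sketch below shows the idea in Python; note that the record shape here is invented purely for illustration and is not the actual Usage API response format.

```python
# Sketch: roll up usage charges by a resource tag (e.g. "department"),
# the kind of charge-back report the Usage API enables. The record shape
# below is invented for the example; it is NOT the Usage API's real schema.

def charges_by_tag(usage_records, tag_name):
    totals = {}
    for rec in usage_records:
        key = rec["tags"].get(tag_name, "(untagged)")
        totals[key] = totals.get(key, 0.0) + rec["cost"]
    return totals

records = [
    {"resource": "vm-1", "cost": 12.50, "tags": {"department": "Dept A"}},
    {"resource": "db-1", "cost": 30.00, "tags": {"department": "Dept A"}},
    {"resource": "vm-2", "cost": 7.25,  "tags": {"department": "Dept B"}},
    {"resource": "vm-3", "cost": 3.00,  "tags": {}},
]

print(charges_by_tag(records, "department"))
```

Surfacing an explicit "(untagged)" bucket is a useful design choice in practice: it makes untagged spend visible instead of silently dropping it from the report.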
Partner Solutions

Two of our Azure Billing partners (Cloudyn and Cloud Cruiser) have already integrated the new Billing APIs into their products:

- Cloudyn has integrated with the Azure Billing APIs to provide IT financial management insights on cost optimization. You can read more about their integration experience in Microsoft Azure Billing APIs enable Cloudyn to Provide ITFM for Customers.
- Cloud Cruiser has integrated with the Azure RateCard API to provide an estimate of what it would cost the customer to run the same workloads on Azure. They are also working on integrating with the Azure Usage API to provide insights based on Azure consumption. You can read more about their integration in Cloud Cruiser and Microsoft Azure Billing API Integration.

You can adopt one or both of the above solutions immediately and use them to better track your Azure bill without having to write a single line of code. Cloudyn’s integration enables you to view and query the breakdown of Azure usage by resource tags (e.g. “Dev/Test”, “Department abc”, “Project X”) and to see the trend of estimated charges over time. Cloud Cruiser’s integration shows the estimated cost of running a workload on Azure.

Using the Billing APIs directly

You can also use the new Billing APIs directly to write your own custom reports and billing tracking logic. To get started with the APIs, you can leverage the code samples on GitHub. The Billing APIs leverage the new Azure Resource Manager, use Azure Active Directory for authentication, and follow the Azure role-based access control policies. The code samples we’ve published show a variety of common scenarios and how to integrate this logic end to end.

Summary

The new Azure Billing APIs make it much easier to track your bill and save money. As always, please reach out to us on the Azure Feedback forum and through the Azure MSDN forum.

Hope this helps,
Thu, 16 Apr 2015 17:01:22 GMT
I’m very excited to announce the general availability release of Azure Premium Storage. It is now backed by an enterprise-grade SLA and is available for everyone to use.

Microsoft Azure now offers two types of storage: Premium Storage and Standard Storage. Premium Storage stores data durably on Solid State Drives (SSDs) and provides high-performance, low-latency disk storage with consistent performance delivery guarantees. Premium Storage is ideal for I/O-sensitive workloads, and is especially great for database workloads hosted within Virtual Machines. You can optionally attach several Premium Storage disks to a single VM, with support for up to 32 TB of disk storage per Virtual Machine, and drive more than 64,000 IOPS per VM at less than 1 millisecond latency for read operations. This provides an incredibly fast storage option that enables you to run even more workloads in the cloud. Using Premium Storage, Azure now offers the ability to run more demanding applications, including high-volume SQL Server, Dynamics AX, Dynamics CRM, Exchange Server, MySQL, Oracle Database, IBM DB2, MongoDB, Cassandra, and SAP solutions.

Durability

Durability of data is of utmost importance for any persistent storage option. Azure customers have critical applications that depend on the persistence of their data and high tolerance against failures. Premium Storage keeps three replicas of data within the same region, and ensures that a write operation will not be confirmed back until it has been durably replicated. This is a unique cloud capability provided only by Azure today. In addition, you can optionally create snapshots of your disks and copy those snapshots to a Standard GRS storage account, which enables you to maintain a geo-redundant snapshot of your data, stored more than 400 miles away from your primary Azure region, for disaster recovery purposes.
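The per-VM figures above (up to 32 TB and more than 64,000 IOPS) come from attaching multiple premium disks to one VM, with the VM size imposing its own cap. A small sketch of that arithmetic; the per-disk figures used here (1 TB and 5,000 IOPS, roughly a P30 disk of the era) are assumptions for illustration, so verify them against current documentation.

```python
# Sketch: aggregate capacity and IOPS from attaching several Premium Storage
# disks to one VM, capped by the VM's own IOPS limit. The per-disk figures
# (1 TB, 5,000 IOPS, roughly a P30 of the era) are assumptions; verify them
# against current Azure documentation before relying on them.

DISK_TB = 1
DISK_IOPS = 5000

def attached_disk_totals(disk_count, vm_iops_cap):
    capacity_tb = disk_count * DISK_TB
    iops = min(disk_count * DISK_IOPS, vm_iops_cap)  # VM cap wins
    return capacity_tb, iops

# A large VM with 16 such disks and a 64,000 IOPS cap:
print(attached_disk_totals(16, 64000))  # (16, 64000)
```

Note how the VM-level cap, not the disks, becomes the bottleneck once enough disks are attached, which is why the post quotes a per-VM IOPS number.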
Available Regions

Premium Storage is available today in the following Azure regions: West US, East US 2, West Europe, East China, Southeast Asia, and West Japan. We will expand Premium Storage to all Azure regions in the near future.

Getting Started

You can easily get started with Premium Storage today. Simply go to the Microsoft Azure Management Portal and create a new Premium Storage account. You can do this by creating a new Storage Account and selecting the “Premium Locally Redundant” storage option (note: this option is only listed if you select a region where Premium Storage is available). Then create a new VM and select the “DS” series of VM sizes. The DS-series of VMs are optimized to work great with Premium Storage. When you create the DS VM you can simply point it at your Premium Storage account and you’ll be all set.

Learning More

Learn more about Premium Storage from Mark Russinovich's blog post on today's release. You can also see a live 3-minute demo of Premium Storage in action by watching Mark Russinovich’s video on Premium Storage, in which he shows both a Windows Server and a Linux VM driving more than 64,000 disk IOPS with low latency against a durable drive powered by Azure Premium Storage. You can also visit the following links for more information: Premium Storage overview, Premium Storage REST operations, "DS" series VM specifications, and the Channel 9 video on Premium Storage.

Summary

We are very excited about the release of Azure Premium Storage. Premium Storage opens up so many new opportunities to use Azure to run workloads in the cloud, including migrating existing on-premises solutions. As always, we would love to hear feedback via comments on this blog, the Azure Storage MSDN forum, or send email to email@example.com.

Hope this helps, Scott[...]
Tue, 24 Mar 2015 14:23:42 GMT
In a mobile-first, cloud-first world, every business needs to deliver great mobile and web experiences that engage and connect with their customers, and which enable their employees to be even more productive. These apps need to work with any device, and to be able to consume and integrate with data anywhere.

I'm excited to announce the release of our new Azure App Service today, which provides a powerful new offering to deliver these solutions. Azure App Service is an integrated service that enables you to create web and mobile apps for any platform or device, easily integrate with SaaS solutions (Office 365, Dynamics CRM, Salesforce, Twilio, etc), easily connect with on-premises applications (SAP, Oracle, Siebel, etc), and easily automate business processes while meeting stringent security, reliability, and scalability needs.

Azure App Service

Azure App Service includes the Web App + Mobile App capabilities that we previously delivered separately (as Azure Websites + Azure Mobile Services). It also includes powerful new Logic/Workflow App and API App capabilities that we are introducing today for the very first time, along with built-in connectors that make it super easy to build logic workflows that integrate with dozens of popular SaaS and on-premises applications (Office 365, Salesforce, Dynamics, OneDrive, Box, Dropbox, Twilio, Twitter, Facebook, Marketo, and more).

All of these features can be used together at one low price. In fact, the new Azure App Service pricing is exactly the same as our previous Azure Websites offering. If you are familiar with our Websites service, you now get all of the features it previously supported, plus new mobile support, plus new workflow support, plus new connectors to dozens of SaaS and on-premises solutions, at no extra charge.
Web + Mobile + Logic + API Apps

Azure App Service enables you to easily create Web + Mobile + Logic + API Apps. You can run any number of these app types within a single Azure App Service deployment. Your apps are automatically managed by Azure App Service and run in managed VMs isolated from other customers (meaning you don't have to worry about your app running in the same VM as another customer). You can use the built-in AutoScaling support within Azure App Service to automatically increase and decrease the number of VMs that your apps use based on their actual resource consumption.

This provides an incredibly cost-effective way to build and run highly scalable apps that provide both Web and Mobile experiences, and which contain automated business processes that integrate with a wide variety of apps and data sources. Below are additional details on the different app types supported by Azure App Service. Azure App Service is generally available starting today for Web apps, with the Mobile, Logic and API app types available in public preview.

Web Apps

The Web App support within Azure App Service includes 100% of the capabilities previously supported by Azure Websites. This includes:
- Support for .NET, Node, Java, PHP, and Python code
- Built-in AutoScale support (automatically scale up/down based on real-world load)
- Integrated Visual Studio publishing as well as FTP publishing
- Continuous Integration/Deployment support with Visual Studio Online, GitHub, and BitBucket
- Virtual networking support and hybrid connections to on-premises networks and databases
- Staged deployment and test-in-production support
- WebJob support for long-running background tasks

Customers who have previously deployed an app using the Azure Website service will notice today that these apps are now called "Web Apps" within the Az[...]
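To make the AutoScale behavior described above concrete, here is a minimal sketch of the kind of threshold rule an autoscale engine evaluates. This is purely illustrative - the function name, thresholds, and single-VM step size are hypothetical, not Azure App Service's actual algorithm or API:

```python
# Illustrative sketch of a threshold-based autoscale rule.
# All thresholds and limits below are hypothetical examples.

def autoscale_decision(avg_cpu_percent, current_vms, min_vms=1, max_vms=10,
                       scale_out_above=75, scale_in_below=25):
    """Return the new VM count for a simple threshold-based autoscale rule."""
    if avg_cpu_percent > scale_out_above and current_vms < max_vms:
        return current_vms + 1   # add a VM under sustained load
    if avg_cpu_percent < scale_in_below and current_vms > min_vms:
        return current_vms - 1   # remove a VM when mostly idle
    return current_vms           # stay put inside the comfort band

print(autoscale_decision(90, 3))  # → 4 (scale out)
print(autoscale_decision(10, 3))  # → 2 (scale in)
print(autoscale_decision(50, 3))  # → 3 (no change)
```

The point of a rule like this is that you pay for VMs proportional to actual load, rather than provisioning for peak capacity up front.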
Mon, 23 Feb 2015 20:41:56 GMT

The first preview release of ASP.NET 1.0 came out almost 15 years ago. Since then millions of developers have used it to build and run great web applications, and over the years we have added and evolved many, many capabilities in it. I'm excited today to post about a new release of ASP.NET that we are working on that we are calling ASP.NET 5. This new release is one of the most significant architectural updates we've done to ASP.NET. As part of this release we are making ASP.NET leaner, more modular, cross-platform, and cloud optimized. ASP.NET 5 is now available as a preview release, and you can start using it today by downloading the latest CTP of Visual Studio 2015, which we just made available.

ASP.NET 5 is an open source web framework for building modern web applications that can be developed and run on Windows, Linux and the Mac. It includes the MVC 6 framework, which now combines the features of MVC and Web API into a single web programming framework. ASP.NET 5 will also be the basis for SignalR 3 - enabling you to add real-time functionality to cloud-connected applications. ASP.NET 5 is built on the .NET Core runtime, but it can also be run on the full .NET Framework for maximum compatibility.

With ASP.NET 5 we are making a number of architectural changes that make the core web framework much leaner (it no longer requires System.Web.dll) and more modular (almost all features are now implemented as NuGet modules - allowing you to optimize your app to have just what you need).
With ASP.NET 5 you gain the following foundational improvements:
- Build and run cross-platform ASP.NET apps on Windows, Mac and Linux
- Built on .NET Core, which supports true side-by-side app versioning
- New tooling that simplifies modern Web development
- Single aligned web stack for Web UI and Web APIs
- Cloud-ready environment-based configuration
- Integrated support for creating and using NuGet packages
- Built-in support for dependency injection
- Ability to host on IIS or self-host in your own process

The end result is an ASP.NET that you'll feel very familiar with, and which is also now even more tuned for modern web development.

Flexible, Cross-Platform Runtime

ASP.NET 5 works with two runtime environments to give you greater flexibility when hosting your app. The two runtime choices are:

.NET Core – a new, modular, cross-platform runtime with a smaller footprint. When you target .NET Core, you’ll be able to take advantage of some exciting new benefits:

1) You can deploy the .NET Core runtime with your app, which means your app will run with this deployed version of the runtime rather than the version of the runtime that is installed on the host operating system. Your version of the runtime runs side-by-side with versions for other apps. You can update that runtime, if needed, without affecting other apps, or you can continue running on the same version even though other apps on the system have been updated. This makes app deployment and framework updates much easier and less impactful to other apps running on a system.

2) Your app is only dependent on features it really needs. Therefore, you are never prompted to update/service the runtime for features that are not relevant to your app. You will spend less time testing and deploying updates that are perhaps unrelated to the functionality of your app.

3) Your app can now be run cross-platform. We will provide a cross-platform version of .NET Core for Windows, Linux and Mac OS X systems.
Regardless of which operating system you use for development or which operating system you target for deployment, you will be able to use .NET. The cross-platform version of the runtime[...]
Wed, 18 Feb 2015 16:06:22 GMT

Today we released a number of great enhancements to Microsoft Azure. These include:
- Machine Learning: General Availability of the Azure Machine Learning Service
- Hadoop: General Availability of Apache Storm Support, Hadoop 2.6 support, Cluster Scaling, Node Size Selection and preview of next Linux OS support
- Site Recovery: General Availability of DR capabilities with SAN arrays

I've also included details in this blog post of other great Azure features that went live earlier this month:
- SQL Database: General Availability of SQL Database (V12)
- Web Sites: Support for Slot Settings
- API Management: New Premium Tier
- DocumentDB: New Asia and US Regions, SQL Parameterization and Increased Account Limits
- Search: Portal Enhancements, Suggestions & Scoring, New Regions
- Media: General Availability of Content Protection Service for Azure Media Services
- Management: General Availability of the Azure Resource Manager

All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them:

Machine Learning: General Availability of Azure ML Service

Today, I’m excited to announce the General Availability of our Azure Machine Learning service. The Azure Machine Learning Service is a powerful cloud-based predictive analytics service that makes it possible to quickly create analytics solutions. It is a fully managed service - which means you do not need to buy any hardware nor manage VMs manually. Data Scientists and Developers can use our innovative browser-based machine learning IDE to quickly create and automate machine learning workflows. You can literally drag/drop hundreds of existing ML libraries to jump-start your predictive analytics solutions, and then optionally add your own custom R and Python scripts to extend them.
Our Machine Learning IDE works in any browser and enables you to rapidly develop and iterate on solutions. With today's General Availability release you can easily discover and create web services, train/retrain your models through APIs, manage endpoints and scale web services on a per-customer basis, and configure diagnostics for service monitoring and debugging.

Additional new capabilities with today's release include:
- The ability to create a configurable custom R module, incorporate your own train/predict R scripts, and add Python scripts using a large ecosystem of libraries such as numpy, scipy, pandas, scikit-learn, etc.
- You can now train on terabytes of data using “Learning with Counts”, use PCA or one-class SVM for anomaly detection, and easily modify, filter, and clean data using familiar SQLite.
- An Azure ML Community Gallery that allows you to discover & learn from experiments, and share through Twitter and LinkedIn.
- You can purchase marketplace apps through an Azure subscription and consume finished web services for Recommendation, Text Analytics, and Anomaly Detection directly from the Azure Marketplace.
- A step-by-step guide for the Data Science journey from raw data to a consumable web service to ease the path for cloud-based data science.
- We have added the ability to use popular tools such as IPython Notebook and Python Tools for Visual Studio along with Azure ML.

Get Started

You can learn the basics of predictive analytics and machine learning using our step-by-step data science guide and tutorials. No sign-up or credit card is required to get started using Azure Machine Learning (you can use the machine learning IDE and try experiments for free). Also browse our machine learning gallery to run existing machine learning experiments others have already built - and optionally publish your own e[...]
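The “Learning with Counts” capability mentioned above is based on a simple idea: replace a high-cardinality categorical feature with the per-class counts observed for that value in training data. Here is a minimal pure-Python sketch of that idea - the function names and sample data are hypothetical, and this is not Azure ML's actual implementation:

```python
# Minimal sketch of count-based featurization ("Learning with Counts").
# Names and data are illustrative only.
from collections import defaultdict

def build_count_table(rows):
    """rows: iterable of (category_value, label) pairs with label in {0, 1}."""
    table = defaultdict(lambda: [0, 0])
    for value, label in rows:
        table[value][label] += 1   # index 0 = negative count, 1 = positive
    return table

def count_features(table, value):
    """Map a raw categorical value to [neg_count, pos_count, positive_rate]."""
    neg, pos = table.get(value, [0, 0])
    return [neg, pos, pos / (pos + neg) if (pos + neg) else 0.0]

training = [("example.com", 1), ("example.com", 1), ("example.com", 0),
            ("other.net", 0), ("other.net", 0)]
table = build_count_table(training)
print(count_features(table, "example.com"))  # [neg, pos, positive_rate]
```

The payoff is that a feature with millions of distinct values (domains, user IDs) collapses into a handful of dense numeric columns, which is what makes training on terabytes of data tractable.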
Mon, 16 Feb 2015 18:16:51 GMT
On March 2nd I'm doing an Azure event in London that you can attend for free. I'll be speaking for about 2.5 hours and will do an end-to-end walkthrough of Microsoft Azure, show off a bunch of demos of great new features/capabilities, and talk about some of the improvements coming out over the next few months.
You can sign up and attend the event for free (while tickets last - they are going fast). If you are interested, sign up now. The event is being held at the Mermaid Conference & Events Centre in Blackfriars, London:
Hope to see some of you there!
Thu, 11 Dec 2014 19:14:56 GMT

Today we released a number of great enhancements to Microsoft Azure. These include:
- Premium Storage: New Premium high-performance Storage for Azure Virtual Machine workloads
- RemoteApp: General Availability of Azure RemoteApp service
- SQL Database: Enhancements to Azure SQL Databases
- Media Services: General Availability of Live Channels for Media Streaming
- Azure Search: Enhanced management experience, multi-language support and more
- DocumentDB: Support for Bulk Add Documents and Query Syntax Highlighting
- Site Recovery: General Availability of disaster recovery to Azure for branch offices and SMB customers
- Azure Active Directory: General Availability of Azure Active Directory application proxy and password write back support

All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them:

Premium Storage: High-performance Storage for Virtual Machines

I’m excited to announce the public preview of our new Azure Premium Storage offering. With the introduction of the new Premium Storage option, Azure now offers two types of durable storage: Premium Storage and Standard Storage. Premium Storage stores data durably on Solid State Drives (SSDs) and provides high-performance, low-latency disk storage with consistent performance delivery guarantees.

Premium Storage is ideal for I/O-sensitive workloads - and is great for database workloads hosted within Virtual Machines. You can optionally attach several premium storage disks to a single VM, with support for up to 32 TB of disk storage per Virtual Machine and the ability to drive more than 50,000 IOPS per VM at less than 1 millisecond latency for read operations. This provides a wickedly fast storage option that enables you to run even more workloads in the cloud.
Using Premium Storage, Azure now offers the ability to "lift-and-shift" more demanding enterprise applications to the cloud - including SQL Server, Dynamics AX, Dynamics CRM, Exchange Server, MySQL, Oracle Database, IBM DB2, and SAP Business Suite solutions. Premium Storage is now available in public preview starting today. To sign up to use the Azure Premium Storage preview, visit the Azure Preview page.

Disk Sizes and Performance

Premium Storage disks provide up to 5,000 IOPS and 200 MB/sec throughput depending on the disk size. When you create a new premium storage disk you get the option to select the disk size and performance characteristics you want based on your application performance and storage capacity needs. For the public preview, we are offering three Premium Storage disk configurations:

Disk Type            P10         P20         P30
Disk Size            128 GB      512 GB      1 TB
IOPS per Disk        500         2,300       5,000
Throughput per Disk  100 MB/sec  150 MB/sec  200 MB/sec

You can maximize the performance of your VMs by attaching multiple Premium Storage disks to them (up to the network bandwidth limit available to the VM for disk traffic). To learn the disk bandwidth available for each VM size, see the Virtual Machine and Cloud Service Sizes for Azure page.

Durability

Durability of data is of utmost importance for any persistent storage option. Azure customers have critical applications that depend on the persistence of their data and high tolerance against failures. Premium Storage keeps three replicas of data within the same region. In addition, you can also optionally create snapshots of your disks and copy those snapshots to a Standard GRS storage account - which enables you to maintain a geo-redundant snapshot of your data that is stored at least 400 miles away from your primary Azure region.

Learn More

You can lear[...]
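As a quick sanity check on the numbers above, here is a back-of-the-envelope helper that aggregates capacity, IOPS, and throughput across attached disks. The tier figures come straight from the preview table; the helper itself (its name and the example disk mix) is hypothetical, and real totals are also subject to the per-VM bandwidth caps just mentioned:

```python
# Back-of-the-envelope helper for the preview disk tiers listed above.
# Tier figures are from the table in the post; the disk mix is an example.
TIERS = {
    "P10": {"size_gb": 128,  "iops": 500,  "mb_per_sec": 100},
    "P20": {"size_gb": 512,  "iops": 2300, "mb_per_sec": 150},
    "P30": {"size_gb": 1024, "iops": 5000, "mb_per_sec": 200},
}

def aggregate(disks):
    """disks: dict of tier name -> count. Totals across all attached disks."""
    totals = {"size_gb": 0, "iops": 0, "mb_per_sec": 0}
    for tier, count in disks.items():
        for key in totals:
            totals[key] += TIERS[tier][key] * count
    return totals

# Eleven P30 disks already pass the 50,000 IOPS figure quoted for a VM
# (before any per-VM bandwidth cap is applied):
print(aggregate({"P30": 11}))  # → 11,264 GB, 55,000 IOPS, 2,200 MB/sec
```

This makes it easy to see why attaching multiple disks is how a single VM reaches the headline ">50,000 IOPS" number.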
Wed, 12 Nov 2014 15:45:50 GMT

This week we are holding our Connect() developer event in New York City. This is an event that is being streamed online for free, and it covers some of the great new capabilities coming with the Visual Studio 2015 and .NET 5 releases. You can watch the event live as well as on-demand here. I just finished giving the opening keynote of the event, during which I made several big announcements:

Announcing the Open Sourcing of the .NET Core Runtime and Libraries

Over the last several years we have integrated more and more open source technology into our .NET, Visual Studio, and Azure offerings. We have also begun to open source more of our own code and technology as well. Earlier this year, at the Build 2014 conference, I announced the creation of the .NET Foundation – an independent organization designed to foster the development and collaboration of open source technologies for .NET. We have now open sourced ASP.NET, EF, Web API, NuGet and the "Roslyn" C# and VB compilers under it.

It has been great to see the energy and innovation in these technologies since we made the open source announcements. We continue to have dedicated Microsoft teams working on each of them (several of the teams have more developers than ever before). The open source process is now enabling the teams to collaborate even more with other developers in the community, and every single one of the above projects has now accepted code contributions from developers outside Microsoft. The combination is enabling an even richer flow of ideas, and even better products.

Open Sourcing the .NET Core Runtime and Libraries

Today I’m excited to announce that we are going even further, and will be open sourcing the .NET Core Runtime. This will include everything needed to execute .NET code – including the CLR, Just-In-Time Compiler (JIT), Garbage Collector (GC), and core .NET base class libraries.
We are releasing the source under the MIT open source license and are also issuing an explicit patent promise to clarify users' patent rights to .NET. This morning, we published the public repository on GitHub where the project will be hosted: https://github.com/dotnet/corefx

Today’s source release includes many of the newer core .NET framework libraries (ImmutableCollections, SIMD, XML and MetadataReader). These libraries are fully open, and are ready to accept contributions. Over the next several weeks and months we will continue to transfer source (including the Core CLR, which is not there right now but is in the process of being moved) into the repository and likewise make it open for contributions.

What does this open sourcing mean?

Today’s open source announcement means that developers will have a fully supported, fully open source, fully cross-platform .NET stack for creating server and cloud applications – including everything from the C#/VB compilers, to the CLR runtime, to the core .NET base class libraries, to the higher-level .NET Web, Data and API frameworks. It is an exciting day for .NET, and the new open source process will allow the .NET teams in Microsoft to collaborate even more deeply with other developers around the world. The result is going to be even better products for everyone.

Announcing .NET Core Framework on Linux and OSX

Last month at a Cloud Event we held in San Francisco, Satya Nadella – our CEO – showed a slide like this one where he talked about how Microsoft loves Linux: We’ve worked hard with Azure to make it a first-class cloud platform for Linux based applications, and share[...]
Fri, 31 Oct 2014 06:39:05 GMT

The last three weeks have been busy ones for Azure. Two weeks ago we announced a partnership with Docker to enable great container-based development experiences on Linux, Windows Server and Microsoft Azure. Last week we held our Cloud Day event and announced our new G-Series of Virtual Machines as well as our Premium Storage offering. The G-Series VMs provide the largest VM sizes available in the public cloud today (nearly 2x more memory than the largest AWS offering, and 4x more memory than the largest Google offering). The new Premium Storage offering (which will work with both our D-series and G-series of VMs) will support up to 32 TB of storage per VM, more than 50,000 IOPS of disk IO per VM, and sub-1ms read latency. Combined they provide an enormous amount of power that enables you to run even bigger and better solutions in the cloud.

Earlier this week, we officially opened our new Azure Australia regions – which are our 18th and 19th Azure regions open for business around the world. Then at TechEd Europe we announced another round of new features – including the launch of the new Azure Marketplace, a bunch of great network improvements, our new Batch computing service, general availability of our Azure Automation service and more.

Today, I’m excited to blog about even more new services we have released this week in the Azure Data space. These include:
- Event Hubs: a scalable service for ingesting and storing data from websites, client apps, and IoT sensors
- Stream Analytics: a cost-effective event processing engine that helps uncover real-time insights from event streams
- Data Factory: a service that enables better information production by orchestrating and managing diverse data and data movement

Azure Event Hubs is now generally available, and the new Azure Stream Analytics and Data Factory services are now in public preview.
Event Hubs: Log Millions of Events per Second in Near Real Time

The Azure Event Hubs service is a highly scalable telemetry ingestion service that can log millions of events per second in near real time. You can use the Event Hubs service to collect data/events from any IoT device, from any app (web, mobile, or a backend service), or via feeds like social networks. We are using it internally within Microsoft to monitor some of our largest online systems.

Once you collect events with Event Hubs you can then analyze the data using any real-time analytics system (like Apache Storm or our new Azure Stream Analytics service) and store/transform it into any data storage system (including HDInsight and Hadoop based solutions).

Event Hubs is delivered as a managed service on Azure (meaning we run, scale and patch it for you and provide an enterprise SLA). It delivers:
- The ability to log millions of events per second in near real time
- Elastic scaling support with the ability to scale up/down with no interruption
- Support for multiple protocols, including HTTP and AMQP based events
- Flexible authorization and throttling device policies
- Time-based event buffering with event order preservation

The pricing model for Event Hubs is very flexible – for just $11/month you can provision a basic Event Hub with guaranteed performance capacity to capture 1 MB/sec of events sent to your Event Hub. You can then provision as many additional capacity units as you need if your event traffic goes higher.

Getting Started with Capturing Events

You can create a new Event Hub using the Azure Portal or via the command-line. Choose New->App Se[...]
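Given the pricing model above (each capacity unit captures 1 MB/sec of ingress), a quick capacity-planning sketch falls out naturally. The helper below is illustrative - the function name and the example event rate/size are hypothetical, not an official Azure calculator:

```python
# Rough capacity-planning sketch for Event Hubs based on the figures above:
# one capacity unit ingests 1 MB/sec. Event rate and size are examples.
import math

def capacity_units_needed(events_per_sec, avg_event_bytes, mb_per_unit=1.0):
    """Capacity units needed to absorb a steady ingress rate."""
    ingress_mb_per_sec = events_per_sec * avg_event_bytes / (1024 * 1024)
    return max(1, math.ceil(ingress_mb_per_sec / mb_per_unit))

# 50,000 events/sec at 512 bytes each ≈ 24.4 MB/sec of ingress → 25 units
print(capacity_units_needed(50_000, 512))  # → 25
```

At the quoted $11/month per unit, that example workload would come to roughly $275/month of ingestion capacity - which is the kind of arithmetic the flexible pricing model is designed to make easy.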
Tue, 28 Oct 2014 14:35:34 GMT

Today we released a major set of updates to Microsoft Azure. Today’s updates include:
- Marketplace: Announcing Azure Marketplace and partnerships with key technology partners
- Networking: Network Security Groups, Multi-NIC, Forced Tunneling, Source IP Affinity, and much more
- Batch Computing: Public Preview of the new Azure Batch Computing Service
- Automation: General Availability of the Azure Automation Service
- Anti-malware: General Availability of Microsoft Anti-malware for Virtual Machines and Cloud Services
- Virtual Machines: General Availability of many more VM extensions – PowerShell DSC, Octopus, VS Release Management

All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them:

Marketplace: Announcing Azure Marketplace and partnerships with key technology partners

Last week, at our Cloud Day event in San Francisco, I announced a new Azure Marketplace that helps to better connect Azure customers with partners, ISVs and startups. With just a couple of clicks, you can now quickly discover, purchase, and deploy any number of solutions directly into Azure.

Exploring the Marketplace

You can explore the Azure Marketplace by clicking the Marketplace tile that is pinned by default to the home-screen of the Azure Preview Portal. Clicking the Marketplace tile will enable you to explore a large selection of applications, VM images, and services that you can provision into your Azure subscription. Using the marketplace provides a super easy way to take advantage of a rich ecosystem of applications and services integrated to run great with Azure.

Today’s marketplace release includes multi-VM templates to run Hadoop clusters powered by Cloudera or Hortonworks, Linux VMs powered by Ubuntu, CoreOS, SUSE, and CentOS, Microsoft SharePoint Server Farms, Cassandra clusters powered by DataStax, and a wide range of security virtual appliances.
You can click any of the items in the gallery to learn more about them and optionally deploy them. Doing so will walk you through a simple-to-follow creation wizard that enables you to optionally configure how/where they will run, as well as display any additional pricing required for the apps/services/VM images that you select. For example, below is all it takes to stand up an 8-node DataStax Enterprise cluster:

Solutions you purchase through the Marketplace will be automatically billed to your Azure subscription (avoiding the need for you to set up a separate payment method). Virtual Machine images will support the ability to bring your own license or rent the image license by the hour (which is ideal for proof-of-concept solutions or cases where you need the solution for only a short period of time). Both Azure Direct customers as well as customers who pay using an Enterprise Agreement can take advantage of the Azure Marketplace starting today. You can learn more about the Azure Marketplace as well as browse the items within it here.

Networking: Lots and lots of New Features and Improvements

This week’s Azure update includes a ton of new capabilities in the Azure networking stack. You can use these new networking capabilities immediately in the North Europe region, and they will be supported worldwide in all regions in November 2014. The new network capabilities include:

Network Security Groups

You can now create Network Security Groups to define access control rules for inbound and outbound traffic to a Virtual Machine or a group of virt[...]
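The essence of access-control rules like the Network Security Group rules described above is priority-ordered, first-match evaluation: rules are checked from the lowest priority number upward, the first matching rule decides, and unmatched traffic is denied. Here is an illustrative pure-Python sketch of that evaluation model - the rule fields are simplified and hypothetical, not the actual NSG schema:

```python
# Illustrative sketch of priority-ordered, first-match rule evaluation,
# the model behind access-control lists like NSG rules. Fields simplified.

def evaluate(rules, packet):
    """rules: list of dicts with priority/direction/port/action; first match wins."""
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if (rule["direction"] == packet["direction"]
                and rule["port"] in (packet["port"], "*")):
            return rule["action"]
    return "Deny"  # implicit deny when nothing matches

rules = [
    {"priority": 100, "direction": "inbound", "port": 443, "action": "Allow"},
    {"priority": 200, "direction": "inbound", "port": "*", "action": "Deny"},
]
print(evaluate(rules, {"direction": "inbound", "port": 443}))  # → Allow
print(evaluate(rules, {"direction": "inbound", "port": 22}))   # → Deny
```

The priority ordering is what lets a narrow "allow HTTPS" rule sit in front of a broad catch-all deny, which is the usual way such rule sets are structured.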