Roger Doherty
SQL Down Under Show 48 - Guest: Roger Doherty - Published: 21 Dec 2011
This show features SQL Server program manager Roger Doherty introducing SQL Server 2012.
Details About Our Guest
Roger Doherty is a Senior Program Manager and technical evangelist with the SQL Server engineering team.
Show Notes And Links
Show Transcript
Greg Low: Introducing Show 48 with guest Roger Doherty.
Welcome, our guest today is Roger Doherty. He is a Senior Program Manager and technical evangelist with the SQL Server engineering team. Welcome back to the show Roger.
Roger Doherty: Thanks Greg, good to be back.
Greg Low: If people want to hear more about your background, they can go back and listen to the earlier show. The reason we have got you back this time is of course that SQL Server 2012 is imminent, so tell us why we should be interested in that?
Roger Doherty: Thanks for having me back Greg, and it is good to have the band back together for this release. The SQL Server 2012 release itself is a pretty interesting and exciting release for us. There is a lot happening in the database space as a whole. SQL Server 2012 really spans a whole variety of new capabilities for developers, for DBAs, for IT professionals and even all the way down to the information worker. The really interesting thing happening is the new choices that our customers are going to have in terms of how to deliver those capabilities.
Greg Low: Importantly this is, I suppose we should say a full release rather than a point release, like the previous version?
Roger Doherty: Yes absolutely, it is a full release. The product has grown in terms of its surface area over the years, so a full release means different things depending upon which component you are talking about. From the database engine perspective, this brings a new database compatibility level. So if you are supporting SQL Server on premise and you have an application that targets SQL Server 2008 R2 or some previous release, you should do a bit of testing to make sure it works. Part of that testing would involve flipping the compatibility level to the new level, 110, which is the one that comes with SQL Server 2012.
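(As a rough sketch of the compatibility level flip Roger describes, using a hypothetical AdventureWorks database name:)

    ALTER DATABASE AdventureWorks
    SET COMPATIBILITY_LEVEL = 110;  -- 110 is the SQL Server 2012 level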
Greg Low: I think the thing I noticed yesterday, when we were doing some work on content for courseware, was just how much additional code is shipped with this version. I looked at the installer ISO, for example: in 2008 R2 it was about 2.5 gigabytes; this time it is 4.4 gigabytes.
Roger Doherty: There is quite a bit of new technology in this release that we are pretty excited about. Like I said it spans quite a few different audiences.
Greg Low: I might get you to drill in when you talk about some of the options there; there seems to be a big push in terms of managed services. Perhaps we should start there.
Roger Doherty: It is a good way of talking about a lot of the changes occurring internally at Microsoft, and in the industry as a whole, in terms of how you deliver software. We are taking that self-managed platform that most Microsoft shops have grown up with and we are projecting it onto a couple of delivery platforms. Those would include what we call infrastructure as a service, or IaaS. The whole idea there is to abstract yourself from the hardware and the network layers. You are still dealing with computers, but you don't necessarily have to go out and buy a SAN or buy a new server every time you want to roll out a new database server. We are doing a lot of testing with SQL Server 2012 to make sure that it runs really well in a virtualized configuration, whether that is hosted on premise in your own private cloud configuration, or hosted in somebody else's datacenter, like Windows Azure or Amazon or Rackspace, you name it.
Greg Low: Yes.
Roger Doherty: That is a big push right there.
Greg Low: Yes, I noticed there seems to be a blend in the diagrams, from a self-managed platform at one end, through increasing amounts of management, until you get to software as a service at the other end.
Roger Doherty: Yes, the whole idea of managed services is pushing itself up the stack. What we just talked about was infrastructure as a service, which really abstracts the hardware out of the equation. When you get to platform as a service, you are abstracting all of your systems, database, middleware and runtime technologies away, and you are really just purely dealing with the service at that point. The way we are delivering that is with SQL Azure on Windows Azure, so you can just go up to Windows Azure and say create me a new logical server. Thirty seconds later you have an endpoint that is basically a fully functional SQL database that you can use to build applications.
Greg Low: Yes, I actually find the newest user interface on that good as well. We have been doing a little bit of work on SQL Azure lately, and it is interesting that the one comment that came back from a number of the guys was that it was fun.
Roger Doherty: Yes, it is a very interesting new way of thinking about building applications. The team that I am on has had a big hand in helping design the user experience around that SQL Azure management portal that you just described. The key is that we will support all the existing old-school stuff that you expect from us too, like Management Studio and even SQLCMD, if you have the right ports open to your SQL Azure database. But we also want to make it easy for you, with no tools at all, to go up there, get some work done, create a new database and populate some data. We are trying to build out an excellent web experience for that as well.
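(A hedged sketch of the no-frills workflow Roger describes: once a logical server has been provisioned, you can connect to its master database with a tool like SQLCMD and create a database in T-SQL. The server and database names below are hypothetical, and the MAXSIZE/EDITION options reflect the SQL Azure syntax of the time:)

    -- Connect to the logical server's master database, for example:
    --   sqlcmd -S myserver.database.windows.net -U admin@myserver -d master
    CREATE DATABASE SalesDb (MAXSIZE = 1 GB, EDITION = 'web');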
Greg Low: Where are we at in terms of compatibility now, between SQL Azure and on premise?
Roger Doherty: It is a bit of a journey. I think there are still some rough edges there. The goal moving forward is complete parity, but given that this is a relatively new area I think you will see some continuing differences between them as we move forward. The good news, in terms of the differences in the way SQL Server behaves versus SQL Azure, is that the engineering teams have merged. They are all checking into the same source code base, so for example if you use a union operator or something in your Transact-SQL, it is the exact same code running on both SQL Server and SQL Azure to process it; you are not dealing with two different flavors of SQL Server.
Greg Low: What was interesting too, was actually porting across a number of scripts and things to get them to run. I was pleasantly surprised by the number of objects that are now supported compared to previous versions.
Roger Doherty: Yes, we are making some progress there. I think the first hello-world thing most SQL Server geeks tried was to push AdventureWorks up there, right? Most of them failed miserably in early releases of SQL Azure, simply because we didn't have support for all the little specialized object types in SQL, spatial and things like that.
Greg Low: Yes.
Roger Doherty: As you mentioned, as we move forward, on a quarterly basis we are rolling out new functionality in the cloud. As a matter of fact, a lot of the SQL Server 2012 improvements that we will talk about on the call today are already live in SQL Azure.
Greg Low: Yes, I noticed in your materials you broke out the different areas of investment across the data platform. Perhaps we can look at a few of those.
Roger Doherty: Cool. I like to break up the release into four buckets. The first one is kind of like the crown jewel of SQL Server and that is related to the database engine or what I would call Database Services. So we have a new release of the database engine with a bunch of important innovations there. As I have already mentioned, that database engine technology is the same code now running across SQL Server 2012 and SQL Azure. Different features and capabilities will be turned on depending on where it is running but it is the same code underneath. A lot of the really interesting and important new improvements in the database engine would be things like our new column store index.
Greg Low: Yes.
Roger Doherty: So that technology is really about flipping the data on its side: we go through and index on a column basis, then compress it all down into blobs that can be read and brought into memory very rapidly. What you get is insanely good query performance for very large data warehousing kinds of queries, as opposed to what you could get with the previous B-tree indexing technology. That is shared technology across the SQL Server engine team and the team that is responsible for Power Pivot.
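(A minimal sketch of the new index type, against a hypothetical fact table; note that in SQL Server 2012 only nonclustered columnstore indexes are available, and creating one makes the underlying table read-only until the index is dropped or disabled:)

    CREATE NONCLUSTERED COLUMNSTORE INDEX csi_FactSales
    ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, OrderQuantity);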
Greg Low: Yes, and again in the case of Power Pivot, we see quite startling levels of compression of the data, and even though we seem to be doing a brute force search, because we are able to get so much of the data in memory and it is so compressed, the performance ends up being quite something.
Roger Doherty: Yes, it is an exciting improvement. It is something that is long overdue; frankly, people have been buying specialized database server platforms to do this kind of work in the past. Once again we are going to provide this in the box, a much lower entry point for folks that want to take advantage of that technology.
Greg Low: What about other things in the database engine? I was intrigued by some of the changes around full-text and semantic search and so on. At one of the events we attended in the middle of last year they covered off the things happening in Denali, and I thought the couple of hours that Michael Rys presented on beyond-relational, full-text and so on were some of the best material of the whole week actually.
Roger Doherty: Yes, I am equally excited about those capabilities. What you are seeing are some of the early investments that we made in technology like FileStream really starting to come into their own and enabling some pretty exciting new capabilities. I like to describe it as SQL Server branching out and handling different types of storage.
Greg Low: Yes.
Roger Doherty: We are all familiar with standard relational table storage that is really good at strings and numbers but maybe not so good at large binary objects like images and documents or things of that nature.
Greg Low: Yes, what fascinates me there, and the thing I love with full-text already, is that it lets you build interfaces that are much more like what humans want rather than what IT people want. I find IT people like everything nice and neat and precise, but users just like stuff that is all soft and fuzzy.
Roger Doherty: Yes bring it all together right, they don’t really care where it is stored. They just want access to their information. They don’t want to go back through five different pieces just to get at their data, right.
Greg Low: Yes, I think the best indication of that in recent years is a mapping program like Bing Maps. If people think back to when those were first introduced, the user interface had lots of separate little boxes where you had to enter street number here and suburb there. All of those have now evolved into a single textbox that says: what are you looking for?
Roger Doherty: Exactly.
Greg Low: Yes, we don't build…
Roger Doherty: Geo-located data, yes exactly; you get all kinds of rich information back, documents.
Greg Low: Yes, I can type Power and it goes straight to it. The thing I find is that we still build business applications like the old versions of those, where we have all the little boxes, and I think full-text lets you build much more sophisticated interfaces, and it understands language. That's the important thing: drive, drove and driving are all, to a human, the same concept, and it lets you deal with that.
Roger Doherty: Totally agree, and in a certain regard what you are describing is the consumerisation of line-of-business software. People have better tools at home for their consumer lives than they do at work.
Greg Low: Exactly.
Roger Doherty: What are we doing in the platform to enable those kinds of experiences? This category of improvements that you are describing is really fundamental to that, right?
Greg Low: So semantic search.
Roger Doherty: The FileStream capability is really about doing a better job of managing other types of data than just relational data. What we have done is layered another improvement on top of that, called FileTable. Think of a FileTable as a two-headed table: on one hand you can talk to it using Transact-SQL, query it and pull documents out of it. On the other hand you can talk to it using any SMB application, like Windows Explorer, Word, Excel, anything that talks SMB. They are both writing to and reading from the same location, but it is all managed by SQL Server from an administrative perspective and a security perspective. This removes the one fundamental roadblock that prevented people from using technology like FileStream: how do I actually get my content into SQL Server?
Greg Low: Yes.
Roger Doherty: It is now just a drag-and-drop or copy operation. It is as easy as that to move content in and out of SQL Server using this new FileTable construct.
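(A minimal sketch of the FileTable construct Roger describes, assuming FILESTREAM is already enabled for the instance and a FILESTREAM filegroup exists in the database; the database, table and directory names are hypothetical:)

    -- Allow SMB clients such as Windows Explorer to write directly to FileTables
    ALTER DATABASE Docs
    SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'Docs');

    -- A FileTable has a fixed schema managed by SQL Server
    CREATE TABLE dbo.Documents AS FILETABLE
    WITH (FILETABLE_DIRECTORY = N'Documents');

    -- Query the same content that users drag and drop in via the SMB share
    SELECT name, file_type, cached_file_size
    FROM dbo.Documents;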
Greg Low: I noticed also that one of the restrictions before was that we couldn't use FileStream with mirroring, but with the high availability options available in this version those types of restrictions disappear as well.
Roger Doherty: Yes, we took the training wheels off. We decided technology for managing new rich types of data is important enough to make it highly available. Our new AlwaysOn high availability and disaster recovery technology fully supports FileStream data as part of your high availability and disaster recovery configuration.
Greg Low: Yes, actually, before we talk about AlwaysOn, which I think should probably be next, there is also the semantic search layer that has been added as well.
Roger Doherty: Yes, you mentioned full-text indexing, and that is technology that is well known and loved by web developers, but not so much by people who build OLTP applications or data warehousing applications. It might be worth a quick introduction to what full-text search is. Full-text search lets you do inherently non-relational searches against content. What is content? Content could be something as small as a memo field that you type data into, a varchar(4000) field. It could be some XML documents that you have stored in an XML column, or it could be some Word or Excel documents that you have stored in a FileTable. Any of those qualify as content, and you can create full-text indexes to aggregate the textual information in that content and enable very powerful search scenarios over the top of it. The typical example you were mentioning before, Greg, was: I am looking for a string for a customer and I have got my data broken out into different columns, customer name, customer address, you know, province. Traditionally we would create indexes on individual columns and do searches, but if I am just looking for a word regardless of which column it shows up in, that is a pretty crazy SQL query. It is extremely easy to do with a full-text index, right? You just say, hey, look for this word, say "unhappy"; I am a customer service app and I am looking for dissatisfied customers. I can search for the word "unhappy" and have it show up whichever column it happens to be stored in, if I have full-text indexing technology there. That technology is growing up in SQL Server 2012 and is getting more robust, supporting high availability and disaster recovery and new search options. The really exciting thing is the one that you mentioned, which is the semantic search capability.
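(A hedged sketch of the scenario Roger describes, against a hypothetical customer feedback table that already has a unique key index named PK_CustomerFeedback:)

    CREATE FULLTEXT CATALOG CustomerCatalog;

    CREATE FULLTEXT INDEX ON dbo.CustomerFeedback (Subject, Comments)
    KEY INDEX PK_CustomerFeedback ON CustomerCatalog;

    -- Find the word regardless of which column it appears in
    SELECT FeedbackId, Subject, Comments
    FROM dbo.CustomerFeedback
    WHERE CONTAINS((Subject, Comments), 'unhappy');

    -- Language awareness: matches drive, drove, driving and so on
    SELECT FeedbackId, Subject
    FROM dbo.CustomerFeedback
    WHERE CONTAINS(Comments, 'FORMSOF(INFLECTIONAL, drive)');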
Greg Low: This is a nice new layer over the top that is more targeted at getting meaning out of text.
Roger Doherty: Exactly. The way I like to describe it is using the concept of tag clouds. Anybody who is familiar with writing a blog will go in, write a blog post and create a set of tags that describe the contents of the blog post. Let's say, Greg, that you and I are contributing to the same blog: what is to guarantee that you and I will use the same taxonomy for the tags we use to describe our blog posts?
Greg Low: Yes, almost zero.
Roger Doherty: Yes exactly, so what you wind up with is a very inconsistent, semi-usable kind of taxonomy for searching things. Well, semantic search solves that problem. It uses a statistical algorithm that is language aware; as part of your full-text index it will do term extraction, understand the structure of the language of the underlying content, work out what that content is talking about, and create a set of terms, ranking those terms based upon how relevant they are to the underlying content. Using that, you can automatically create a tag cloud over your content, as an aggregate or over individual pieces of content. You can do very powerful searches, like: find me things that are like this document that I am looking at now. If you think about the implications of that for content management systems, in the insurance industry or the legal industry, the applications are huge.
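(A hedged sketch of the "find me things like this document" scenario, assuming a full-text index created with the STATISTICAL_SEMANTICS option over the hypothetical dbo.Documents FileTable, and the semantic language statistics database installed:)

    DECLARE @doc UNIQUEIDENTIFIER =
        (SELECT stream_id FROM dbo.Documents WHERE name = N'contract.docx');

    -- The automatically extracted "tag cloud" for that document
    SELECT TOP (10) keyphrase, score
    FROM SEMANTICKEYPHRASETABLE(dbo.Documents, file_stream, @doc)
    ORDER BY score DESC;

    -- Documents most similar to the one being viewed
    SELECT TOP (10) matched_document_key, score
    FROM SEMANTICSIMILARITYTABLE(dbo.Documents, file_stream, @doc)
    ORDER BY score DESC;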
Greg Low: You also touched on AlwaysOn and we should mention that.
Roger Doherty: As we have grown, we have continued to add mission critical capabilities to SQL Server, allowing you to make your application highly available and providing disaster recovery options. In SQL Server 2012 we have a major rationalization and series of improvements around that technology, called AlwaysOn. It makes it possible for you to implement a high availability and disaster recovery solution that combines what I would call database redundancy, which we used to call mirroring, into a set of capabilities called availability groups. The basic idea behind this is: let's say I have an ERP application, and in that ERP application there are four databases that have to be online in order for the application itself to be functional. All I do in AlwaysOn is add those four databases to my availability group, and then I can create replicas of those things on up to four different nodes.
Greg Low: Yes, I think that is a really key change there too. Previously, with mirroring, we had one replica and it was either synchronous or asynchronous, but now we have lots of choices.
Roger Doherty: Right, so for each replica you can define whether it is synchronous. Synchronous obviously requires more bandwidth, but there is zero possibility of data loss there. Asynchronous requires less bandwidth, but you might lose a little bit of data in an asynchronous scenario if there is a failover, right?
Greg Low: Yes, now we can have up to four replicas and two of them synchronous. If we want combinations of synch and asynch, it means I can have local high availability but long distance disaster recovery.
Roger Doherty: Yes I can do all that on commodity hardware as well with availability groups which is also very exciting stuff. You can think of availability groups as the new mirroring on steroids.
Greg Low: Yes, I think the other notable thing there was the concept of a listener, where we can now define a specific name for client applications to connect to, which simplifies the redirection when things do fail over.
Roger Doherty: Yes, that is absolutely critical. In the past, with mirroring, we had a special node that we called the witness. That all goes away; we now use the built-in capabilities of Windows Server, called Windows Server Failover Clustering, to create a durable listener that you use to connect your applications to. When the applications connect to the listener, the listener routes them to wherever the primary is. This is really transparent to the application tier.
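(A minimal, hedged sketch of the availability group and listener being described; the server, database and network names are hypothetical, and the databases, endpoints and Windows Server Failover Cluster are assumed to be prepared already:)

    CREATE AVAILABILITY GROUP ErpAG
    FOR DATABASE ErpGL, ErpInventory, ErpSales, ErpPayroll
    REPLICA ON
        N'SQLNODE1' WITH (ENDPOINT_URL = N'TCP://sqlnode1.corp.local:5022',
                          AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                          FAILOVER_MODE = AUTOMATIC),
        N'SQLNODE2' WITH (ENDPOINT_URL = N'TCP://sqlnode2.corp.local:5022',
                          AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
                          FAILOVER_MODE = MANUAL);

    -- Client applications connect to the listener name, not to a specific node
    ALTER AVAILABILITY GROUP ErpAG
    ADD LISTENER N'ErpListener' (WITH IP ((N'10.0.0.50', N'255.255.255.0')), PORT = 1433);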
Greg Low: The final thing you had in your database services area was the SQL Azure data sync.
Roger Doherty: Yes, so we have a set of technologies for replicating and synchronizing data. Right now we have two flavors of that. One is for the on-premise world, which is SQL Server replication; that isn't really changing much in the Denali wave. Then we have something for the cloud world, which is called SQL Azure Data Sync. SQL Azure Data Sync is really for the offline sync scenario, for scheduled synchronization. So I can have a data set with a master copy of it up in SQL Azure, and I can then create copies of that on other SQL Azure nodes, even in different data centers, or on premise. So I could have a kind of hub-and-spoke topology where I have got my master data up in SQL Azure, say in the South Central US data center, I can have a node out in South East Asia running in SQL Azure there, and I can have a node in my on-premise environment, running on SQL Server in my data center in Chicago.
Greg Low: Yes.
Roger Doherty: I create that whole topology, specify who wins if a conflict is detected, specify the schedule on which I want to sync, and behind the scenes SQL Azure will make it happen.
Greg Low: While we are on the topic of moving data around, that leads into the Data Integration Services pillar.
Roger Doherty: Yes.
Greg Low: One of the things that I am most excited about in this version is actually Integration Services. I always love the little things that won't make it onto a brochure; I love it when lots and lots of little improvements occur. I think that happened with Reporting Services in 2008 R2, and this time it is Integration Services.
Roger Doherty: Yes, this time Integration Services is going through a major overhaul, but I don't want to scare anybody. We are not going to break your SSIS packages like we did when we moved from DTS to SSIS, so don't worry about that. We will have a functional compatibility mode in which you can run your existing SSIS packages unchanged on SQL Server 2012. That is a big deal. If you want to leverage some of the new functionality, we have a great new design experience. Things that should have been there a long time ago, like undo and redo, are in the package design experience that is there.
Greg Low: That is a classic example of something that you don’t usually see on the brochures of things.
Roger Doherty: You are not going to make a big deal about having undo/redo support, right?
Greg Low: Whenever you show it in a room of people, that is one of the woo-hoo moments.
Roger Doherty: That applies to Integration Services for sure. I think the big deal with Integration Services is that we now have a proper server administration model for it. We had this weird thing with DTExec, where you had a binary executable that you would use to run your packages, and people would inadvertently fire up a package, think it was running on the server, when in actual reality it was running on their laptop, which could be a bad thing. You now have a whole new set of administration tools and catalogs, right within SQL Server Management Studio, where you can install your packages, execute them, monitor them, configure them and troubleshoot them, and it is all tied together very nicely. So, big investments in the server tooling around Integration Services.
Greg Low: Yes, I thought the two most noticeable things there are the idea that you now deploy projects, rather than just packages, and that you can have connections and parameters and things at the project level, which is really sweet. The one I am most impressed about is that, because it now lives in a database, (a) it is instance aware, which it wasn't before, and (b) we can now execute packages, as you said, using T-SQL. I can execute T-SQL commands to fire off and run packages.
Roger Doherty: Yes I can write a procedure that runs SSIS packages with no problem.
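(A hedged sketch of running a deployed package from T-SQL via the new SSIS catalog; the folder, project and package names are hypothetical:)

    DECLARE @exec_id BIGINT;

    -- Create an execution for a package deployed to the SSIS catalog
    EXEC SSISDB.catalog.create_execution
        @folder_name     = N'ETL',
        @project_name    = N'LoadWarehouse',
        @package_name    = N'LoadFactSales.dtsx',
        @use32bitruntime = 0,
        @execution_id    = @exec_id OUTPUT;

    -- Start it; this could just as easily live inside a stored procedure
    EXEC SSISDB.catalog.start_execution @execution_id = @exec_id;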
Greg Low: Yes. Master Data Services in this release, where are we at with that?
Roger Doherty: The technology for Master Data Services showed up in SQL Server 2008 R2. That was an acquired database technology, so we first got it into the box in that wave, and it was a big effort just to make that happen. I think in the SQL Server 2012 release you will start to see some new functionality show up in Master Data Services that people really want to see: much better integration with Excel, in terms of managing the various repositories that are involved in setting up a Master Data Management solution.
Greg Low: Yes, I think if I was editing a whole lot of products, Excel is the place to be doing that rather than inside the MDS interface itself.
Roger Doherty: Yes, most definitely; we don't want to be in the business of creating the user interface for that, and most people understand Excel, so that is a good way to go. I think for anybody looking to wire together disparate information systems that share common data sets, Master Data Services is a good set of frameworks and tools to help you implement that.
Greg Low: Yes I think the one that intrigues a lot of people in this section is now the introduction of Data Quality Services.
Roger Doherty: Yes, this is a good one. I think most SSIS developers like myself think they are capable… Just pulled my headset out, sorry about that. I got a little bit excited on this end.
Greg Low: (Laughs)
Roger Doherty: Most SSIS developers think they are capable of writing the ultimately flexible SSIS package that will handle any kind of dirty data. The reality is that it is basically impossible, particularly when you are getting data in from places like the internet or from new customers that you are onboarding. You really can't predict how clean their data is going to be and whether or not it can be introduced into your system. The people that really understand this are the domain experts for that data, and they have different titles: some of them are called Data Analysts, some of them are called Business Analysts, some are called Auditors. We really haven't had tooling for those people, the domain experts for data, and that's what Data Quality Services is designed for. It is tooling for those people: it helps them iteratively build a knowledge base for scrubbing dirty data and getting it into a form where it can be introduced into the system. Once that knowledge base has been built out, developers with tools like Integration Services can leverage it as part of operational data cleansing and scrubbing initiatives.
Greg Low: The other one in that category is StreamInsight and a lot of people were surprised when it appeared in the product in 2008 R2. Changes in this version?
Roger Doherty: I am a big StreamInsight advocate. When you think about that pillar of data integration services, you are really looking at different workloads. Integration Services is about the ETL workload; it is about Extract, Transform and Load. Master Data Services is about MDM work (Master Data Management). Data Quality Services is a set of tooling for people who need to scrub dirty data but who are end users. StreamInsight is for a whole new workload that most people in the SQL Server space might be unfamiliar with, called CEP or Complex Event Processing. Essentially, CEP is a scenario where you have an extremely high volume of event stream data flying at you at a very high rate, and you need to reason over the top of it in real time. You don't have time to persist it to a database, transform it into a data warehouse, build a cube and run a report. By the time all that happens…
Greg Low: It’s too late.
Roger Doherty: The thing you needed to be alerted on may be long in the past.
Greg Low: Look, that's a great integration story. The next pillar you had was Analysis Services. This one seems to have raised a lot of discussion in the community. As soon as there was discussion around Analysis Services now having a new option around the model, there was a lot of concern about what that means for previous investments and things.
Roger Doherty: It's true; we have a lot of new functionality for that kind of number crunching capability that we have in Analysis Services. Most people are probably aware that the team that invented Power Pivot came out of the Analysis Services team and is still part of the Analysis Services team. What we are trying to do here is bring those two worlds together. We have this highly productive, very agile data warehousing sandbox thing that we have done with Power Pivot, and we have this highly structured, extremely flexible and highly complex thing that we have done with Analysis Services and MDX. Over time, we would like to minimize the differences between those things and move the whole thing forward.
Greg Low: The important message anyway is that Analysis Services can support the tabular models that are similar to the Power Pivot experience, and also what was listed as Multidimensional and Data Mining models.
Roger Doherty: Yes, the big news there is you now have an option when you install Analysis Services to run in a couple of different modes. You can run it in the Multidimensional/Data Mining mode, which is the existing MDX, UDM, OLAP scenario that we have been building for the past 10-plus years. That will work great and we will continue to move it forward. You can also install it in Tabular mode, and when you install it in Tabular mode you can basically publish Power Pivot models up to your Analysis Services instance. Previously you could publish Power Pivot models up to a Power Pivot server running in SharePoint; this gives you a new potential destination to publish your Power Pivot models. Why is that interesting to Analysis Services people? It is interesting because once you publish to Analysis Services, you get access to some of that good in-memory compression technology that was formerly only available if you were running in a SharePoint world.
Greg Low: Yes.
Roger Doherty: Number two, you can get at that data using the traditional MDX interfaces that most cube browsers and OLAP applications were built to leverage. That new Tabular Model supports dimensions, it supports hierarchies, it supports KPIs and it has the ability to service MDX queries. It is a great bridge between this Power Pivot world and the existing very high end Analysis Services OLAP mode.
Greg Low: Yes, and the DAX language, instead of just being for measures and calculated columns, now becomes a fully-fledged query language for running queries against it as well.
Roger Doherty: Yes, so not only are we bridging back to MDX developers, who will be able to talk to these new Tabular Models, but at the same time we are making the Power Pivot world more capable and more functional with a new version of DAX. Just as you described, there will be a new Power Pivot client for Excel and a new Power Pivot server for SharePoint, and those will be the two environments in which you would use that new DAX capability you just mentioned.
Greg Low: Yes I think one of the powerful aspects is that we now have a choice between developing models inside Power Pivot in Excel like we had before. We also have the designer as a much richer experience inside Visual Studio as part of SQL Server data tools.
Roger Doherty: That really gets to the heart of what we are trying to accomplish here. We have had this proliferation of BI layers or BI models: we had Report Models, we had the UDM, and you had to create multiple data layers depending upon the user experience you wanted to drive. We would like to get to a world where there is one metadata layer, this thing called the BI Semantic Model, underneath all of your user experiences: for scorecards and dashboards, for Excel-type users, for operational reporting, for all of them. The design experience for that is indeed something that will be delivered in two ways: through Excel for the basic user who wants to get up and running right now, and then a richer design experience in Visual Studio as well for the more professional developer.
Greg Low: A few nice things I noticed in Power Pivot, or in the Tabular Model, this time around. Support for hierarchies seems to be an aspect that has been well received, plus the ability to have multiple relationships between tables, even though only one is active.
Roger Doherty: And calculated measures. These are all concepts well known and well understood by the Analysis Services community; when they first looked at Power Pivot and those things weren't there, they were very confused. Having them there now is really the bridge we needed to get those existing MDX and OLAP developers more comfortable with the BI Semantic Model and Tabular Model technologies.
Greg Low: I suppose, when you talk about getting corporate folk interested as well, there is a nice security layer, and there is also the introduction of things like perspectives.
Roger Doherty: Yes indeed. Before, with Power Pivot, if you had access to the Power Pivot workbook you had access to everything in it. Now, with row-based security and perspective support, you can constrain people to sections of your model based upon who they are.
Greg Low: Yes, so the final pillar is the Reporting Services area. First up, we now have another option: SQL Azure Reporting.
Roger Doherty: Yes, the good news there is that Reporting Services has, from its infancy, always been delivered as a web service. In this new world, where you can choose where to deliver your service, it should be relatively easy to spin it up as a web service, and that's exactly what we did. We took the core Reporting Services capabilities and delivered them as a service on Windows Azure. Now, if you have a Windows Azure subscription, you can go up and provision a new report server and, in less than a minute, have an endpoint to which you can publish your reports and render them.
Greg Low: Yes, I suppose the additional nice thing there is that it minimizes a whole lot of the traffic if your data happens to be in SQL Azure as well.
Roger Doherty: Exactly, so this is the ideal scenario if you are a Windows Azure developer and you need to do some operational reporting and you want to keep everything inside the Windows Azure data center so you are not being charged a lot of data egress for Reporting.
Greg Low: Yes, anything around traditional Reporting Services at all? I suppose we would have the new designer experience inside SQL Server Data Tools.
Roger Doherty: That is an interesting one that you just mentioned there. We do have this new term that we are using, SQL Server Data Tools. The idea there is that all the designers and project systems used to assemble a SQL Server application should ideally be available to the developer, either as a standalone capability or integrated with Visual Studio. SQL Server 2012 is the first release of this and we are kind of half way there. So we have a free version of SQL Server Data Tools that only has the new database designer projects and capabilities, available through the Web Platform Installer. You will be able to go up and install SQL Server Data Tools and start building database applications without even having SQL Server on your box, which is pretty cool, right?
If you need to do more sophisticated things like BI functionality, you are actually going to need to install SQL Server 2012 to get those project systems and designers.
Greg Low: Yes.
Roger Doherty: The good news is that once you do that, everything is all running in the same place. If you don't have Visual Studio on your machine, we install a Visual Studio shell, we call that SQL Server Data Tools, and both the database and BI designers run in it. If you have Visual Studio 2010 on your machine, we just show up as project systems and designers in your core Visual Studio environment.
Greg Low: The final jewel in the crown in this lot is Power View.
Roger Doherty: Yes, when you look at Reporting Services, it is pretty much the same for the 2012 release. The only new functionality in Reporting Services is some nice end-user alerting capabilities that we are delivering when you are running Reporting Services in SharePoint integrated mode. For those shops that do use Reporting Services in SharePoint integrated mode, we have improved the administration, installation and configuration management of that, because we are running it as a shared service now, as opposed to an external process running as an instance of Reporting Services. It is much more SharePoint-native, and your SharePoint administrator should be more comfortable with it. If you install in that mode you now have a data alert capability. Instead of having to manually open and read every report, you basically set up alerts to inform you when something is worth looking at. On top of Reporting Services, we used to have this technology called Report Builder.
Greg Low: And still do.
Roger Doherty: We still have it, and that was a bit of a Freudian slip there: we are announcing the official deprecation of Report Models in SQL Server 2012. Report Models were supposed to be the reporting tool for people that didn't know SQL, and we had limited success with that.
Greg Low: Yes, I found that they were required for Report Builder 1, but ever since then we have had the ability to work in versions 2 and 3 without them, and I find most people have chosen to work without them.
Roger Doherty: In essence, turning Report Builder into the report design tool for people that don’t want to install Visual Studio.
Greg Low: Yes.
Roger Doherty: Really, that's how I think of Report Builder: it is just a different way of designing reports, with a slightly different user interface that is a little more end-user friendly than what you would have if you were running in Visual Studio, but the bottom line is that you still need to know SQL if you are not going to use Report Models.
Greg Low: Yes, I think that is one of the things: over the last few versions I had this feeling that Report Builder was moving upwards in terms of the capability someone needed in order to use it. Even the fact that you open it up in 2008 R2 and it uses words like "dataset" and so on, that is a lot more IT-ish.
Roger Doherty: Yes, you know, 90% of the non-technical users out there wouldn't use the term "dataset".
Greg Low: The nice thing is that Power View now takes a lot of that away.
Roger Doherty: There you go, that is a perfect segue. In the old world, or the existing framework for Reporting Services, you have a developer or a Reporting Services-savvy person go and lay out a report and publish that report, and then people can go and render that report. There is a cycle that happens around those activities. With Power View, what happened was the Reporting Services team and the Analysis Services team got together, looked at the cool things that Power Pivot was doing for Excel users, and said: how can we extend an experience like that to self-service reporting? The good news is that both Power Pivot and Power View run off the same underlying infrastructure, which is that new BI Semantic Model we were talking about. What Power View gives you is a very rich, interactive data exploration experience, where users just pick things up, drop them on a canvas and play until they get to where they want to be for the reporting action they want to perform.
Greg Low: Yes, I think one of the things that I quite like about this as well is that even though it is a very end-user tool, it actually has quite a bit of developer surface, in that you can get in and make decisions about what will appear by default when someone clicks on something, and so on.
Roger Doherty: The best Power View experience is one that is powered by a BI Semantic Model that was thoughtfully laid out by a developer. All the access paths to the data are well defined; you have turned off the things that are noise and focused on just the things you are actually interested in. If you do that work upfront, your end users are going to have a much more productive experience. So the developer story for Power View is really: go off and build an excellent BI Semantic Model and publish it, so that users can go and party on it in Power View.
Greg Low: Roger there is so much to look forward to in SQL Server 2012. I am sure the question everybody would want me to ask is when?
Roger Doherty: Yes, the official public information that we have released is that it will ship sometime in the first half of 2012 so just around the corner.
Greg Low: That is just outstanding. Listen, where will people see more, or where will they see you or any of these things in the upcoming months?
Roger Doherty: You will see quite a bit of activity around the SQL Server 2012 launch wave. If you just check out www.microsoft.com/SQL, that is a good place to start. There you can download the latest pre-release builds of SQL Server 2012 and do a bit of testing. You can also Bing the SQL Server 2012 Developer Training Kit; that will take you to a nice thin installer that will pull down a lot of the developer-oriented training content that folks like Greg and myself are developing to get people up to speed on this new release.
Greg Low: Outstanding, listen thank you so much for your time today Roger, we are looking forward to it.
Roger Doherty: Thanks a lot Greg we will see you soon.