Dot Net Mafia

Group site for developer blogs dealing with (usually) .NET, SharePoint, Office 365, Mobile Development, and other Microsoft products, as well as some discussion of general programming related concepts.

This Blog



Corey Roth [MVP]

A SharePoint MVP bringing you the latest time saving tips for SharePoint 2013, Office 365 / SharePoint Online and Visual Studio 2013.
  • Ionic app with Cordova Firebase plugin terminates on startup with Xcode 9 and iOS 11

    If you are trying out your Ionic / Cordova apps on iOS 11 with Xcode 9, you might run into an issue where the app terminates immediately.  After examining the logs, you will see that the plugin cordova-plugin-firebase is terminating the app because it cannot find the GoogleService-Info.plist file (even though it is there).   You’ll see an error like the following in your logs.

    The GOOGLE_APP_ID either in the plist file 'GoogleService-Info.plist' or the one set in the customized options is invalid. If you are using the plist file, use the iOS version of bundle identifier to download the file, and do not manually edit the GOOGLE_APP_ID. You may change your app's bundle identifier to '(null)'. Or you can download a new configuration file that matches your bundle identifier from and replace the current one.

    *** Terminating app due to uncaught exception 'com.firebase.core', reason: 'Configuration fails. It may be caused by an invalid GOOGLE_APP_ID in GoogleService-Info.plist or set in the customized options.'

    If this happens to you, check your version of the plugin.  On one machine, I had 0.2.4 and on the other I had 0.2.1.  It worked on 0.2.4, so removing the plugin and re-adding it fixed it for me on the affected machine.
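    The check and fix can be done from the command line.  A quick sketch using the Cordova CLI (this assumes cordova-plugin-firebase is the plugin id shown in your logs):

    ```shell
    # List installed plugins to check the cordova-plugin-firebase version
    cordova plugin list | grep cordova-plugin-firebase

    # If the version is outdated (e.g. 0.2.1), remove and re-add the plugin
    cordova plugin rm cordova-plugin-firebase
    cordova plugin add cordova-plugin-firebase
    ```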

  • How to: Fix “Something went wrong” error in Outlook Customer Manager

    I actually use Outlook Customer Manager (OCM) quite a bit to keep track of my leads for my product BrewZap, a custom mobile app platform for breweries.  Unfortunately, it’s not uncommon to run into an error when launching it that says “Something went wrong”. 


    The problem is that sometimes this error will even occur after you close and restart Outlook.  If that happens to you, then open up Internet Explorer (yes, IE), and go to Internet Options.  Then click on Delete under Browsing History.  Check all of the boxes and then restart Outlook.


    If all goes well, Outlook Customer Manager will start again and you can use it again.


  • Running an Ionic 2 PWA using Azure Web Sites

    You can host an Ionic 2 Progressive Web App (PWA) pretty easily on Azure Web Sites (App Service).  If you aren’t sure where to get started, take your Ionic 2 project and add the browser platform if you haven’t already.

    ionic platform add browser

    Now, you can test it locally by running against the browser platform.

    ionic run browser

    Running it on Azure really is just a matter of copying your files to your Azure Web Site via FTP.  You can get the username and address to connect to from your App Service properties.  Connect to it and be sure you change to the /site/wwwroot folder.  This is where the files from your app will go.  You will upload your files from the platforms/browser/www/build folder.  Before you copy your files though, I recommend you do a production build with the --prod flag.  This will make the size of your JS files considerably smaller.

    ionic run browser --prod

    Now copy your files to the FTP site and go to the corresponding URL in your browser.  Your app should be working there. 

    There are a few mime types that you need to configure so that the Ionic fonts and any other files get served by IIS properly.  You do this by creating a web.config.

    <?xml version="1.0"?>
    <configuration>
      <system.webServer>
        <staticContent>
          <mimeMap fileExtension=".json" mimeType="application/json" />
          <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
          <mimeMap fileExtension=".ttf" mimeType="application/octet-stream" />
          <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
          <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
          <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
        </staticContent>
      </system.webServer>
    </configuration>

    If you are working with DeepLinker, you may consider using a path-based location strategy instead of the standard hash-based one.  This effectively removes the hash (#) symbols from all of your URLs.  However, additional configuration will be required.  That’s because IIS hosting your site in Azure will give you a 404 error when you go to any of the routes you have defined.  You need to rewrite your routes to index.html for them to work.  I have found that the rules in the web.config listed below work pretty well.  If you are using query strings, you might run into issues with these rules though, so you may need to do some additional configuration.
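    On the client side, the switch away from the hash strategy is a small config change.  A minimal sketch (this assumes Ionic 2’s locationStrategy config option; MyApp is your root component):

    ```javascript
    // Config object passed to IonicModule.forRoot in app.module.ts.
    // locationStrategy: 'path' removes the # from URLs (the default is 'hash').
    var ionicConfig = {
      locationStrategy: 'path'
    };
    // Then in your NgModule imports:
    //   IonicModule.forRoot(MyApp, ionicConfig)
    ```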

    <?xml version="1.0"?>
    <configuration>
      <system.webServer>
        <staticContent>
          <mimeMap fileExtension=".json" mimeType="application/json" />
          <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
          <mimeMap fileExtension=".ttf" mimeType="application/octet-stream" />
          <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
          <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
          <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
        </staticContent>
        <rewrite>
          <rules>
            <clear />
            <rule name="AngularJS Routes" stopProcessing="true">
              <match url=".*" />
              <conditions logicalGrouping="MatchAll">
                <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                <add input="{REQUEST_URI}" pattern="^/(api)" negate="true" />
              </conditions>
              <action type="Rewrite" url="index.html" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>

    Running your PWA in Azure works a little bit differently, but once you have it configured, it’s a good solution.  If you run into any issues, turn on diagnostic logging in Azure and watch the log streaming to see what is happening.  Be on the lookout for scripts and CSS files returning a copy of index.html instead of what they are supposed to return.  You can easily verify this from the developer tools of any browser.

  • How to: Get a Microsoft Graph Access Token when logging into Azure Active Directory with Azure App Service

    If you have used Azure App Service, you will have loved how easy it is to set up authentication to providers such as Azure Active Directory and Facebook.  You can literally log in with a single line of code like the following:

    client.login('aad').then(results => {
    	// successful login
    }, error => {
    	// login error
    });

    This will give you an id_token that you can then turn into an access_token by calling the /.auth/me endpoint with a REST call.  However, that access_token won’t have access to anything even though you configured App Service to use an App that has requested specific permissions.  CGillum from Microsoft pointed me in the right direction with his post on accessing the Azure AD Graph, but the Microsoft Graph required some tweaks.

    You start by going to the Azure Resource Explorer.  This assumes you have already configured your App Service app to use the particular Azure AD Application that you created.  Find your app service app in the hierarchy, then open /config/authsettings and click Edit.  If you haven’t set your clientSecret yet, you can do so now (although I am not 100% sure it’s required).  However, the key step is to set additionalLoginParams to the following JSON array.

    ["response_type=code id_token",  "resource="]

    This tells /.auth/me to give you the proper access_token when you call it.  You can also get a refresh token this way at the same time.  Once you have made the changes, click the PUT button to send the changes back to the service.  Yours should look something like this.

    Screen Shot 2017-01-12 at 9.19.14 AM

    Now, when you login again and call the /.auth/me endpoint, you’ll get additional data including an access token that works with Microsoft Graph.  If you have logged in before with this particular username and app, you will want to sign out and log back in again to make sure the permissions that you specified in your application get granted.  You may need to add the query string parameter prompt=consent on the login page to get it to prompt you for the new graph permissions.  Otherwise, you’ll get an access token that won’t work with the Microsoft Graph.

    Screen Shot 2017-01-12 at 9.12.36 AM

    As you can see in the screenshot above, the object returned has a lot more information in it than before.  There is nothing particularly sensitive in this screenshot either since this is just a demo tenant.
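    The /.auth/me call itself is just a GET against your App Service with the session cookie attached.  A rough sketch (buildAuthMeUrl is a hypothetical helper added here for illustration):

    ```javascript
    // Build the /.auth/me URL for an App Service app (illustrative helper).
    function buildAuthMeUrl(appServiceUrl) {
      return appServiceUrl.replace(/\/+$/, '') + '/.auth/me';
    }

    // Usage sketch (assumes the user has already logged in with client.login):
    // fetch(buildAuthMeUrl('https://myapp.azurewebsites.net'), { credentials: 'include' })
    //   .then(function (res) { return res.json(); })
    //   .then(function (entries) {
    //     var accessToken = entries[0].access_token;
    //   });
    ```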

  • Ionic 2 debugging on Android with Visual Studio Code

    If you’re new to Ionic 2, you might have encountered some issues getting started debugging.  While you can usually debug fairly easily in the browser, when it comes to debugging on the device, there are some extra steps.

    First, install the extension ionic2-vscode. This will give you the options in the debug menu to start the process. 


    However, before you start that, you need to make a change to package.json so that your breakpoints will be hit.  Change your “build” line to the following:

    "build": "ionic-app-scripts build --dev",

    The key there is to add the --dev parameter.  Now you can hit the debug button and wait.  It will take a while to start up, but once it is running, the app should be on your device and your breakpoints will get hit.

    If you have never deployed to your Android device before, you might try running “ionic run android” at the command line first to see if you have any other issues to resolve.

  • Office 365 Groups or Team Sites? No need to have that discussion any more!

    Any time Microsoft releases a new feature that has an overlap with an existing feature, we see a flurry of fluff in the form of blog posts and even sessions on which feature to use when.  When Office 365 Groups came out, this was no exception.  What has changed?  At the Future of SharePoint event, Microsoft announced that every group in Office 365 will benefit from an associated Team Site.  Every Office 365 Group you create will get a new modern Team Site provisioned that shows a clear linkage to the Group.


    That’s pretty cool and should help eliminate the confusion on what to use when since you no longer have to make a decision.  Microsoft has also stated that existing Office 365 Groups (as in the ones you have now) will also get a Team Site associated with them as well.  This means whatever you are doing now, it’s ok.  You’ll be in good shape when the new features are rolled out.

    The updated Team Site home page provides a quick way to find the most important and relevant content on the site.  Content and news can be pinned to the front page and the Office Graph is baked right in to highlight activity relevant to you.  What’s even better is that you can even access it through the new SharePoint mobile app.

    Another exciting change with the new Team Site experience is that sites will be provisioned faster.  Whereas it used to take several minutes to provision a site collection, now it should only be a matter of seconds.

    Between the Office 365 Group, the new team site home page, and existing team sites, we haven’t seen quite how all of this ties together yet.  It will be interesting to see where things go.

    This isn’t out yet, so where should I put content right now?

    Ok, so the conversation is not quite dead yet.  For now, if you need features that are only in Team Sites such as workflow or metadata, use a Team Site.  If you don’t care about Metadata and the document library in Office 365 Groups is good enough for you, then use that.  As you can see, Office 365 Groups is really starting to tie everything together.

  • One key takeaway from SharePoint 2016 General Availability

    This morning at the Future of SharePoint event, Microsoft announced the General Availability (GA) of SharePoint Server 2016.  While many customers have transitioned to the cloud with Office 365, on-premises SharePoint is still very real and alive.  As an end user, you might have looked at SharePoint Server 2016 and wondered “Why bother?”.   There really aren’t many new features that the end user is going to get excited about.  Yes, you’ll get a nice new suite bar, but most of the rest of the features the user cares about are linked to hybrid scenarios.

    What you should take away from today’s event is that you are not upgrading to SharePoint Server 2016 for what you get today.  You are upgrading for what you get in the future.  Whereas previous versions of SharePoint Server were very much static and didn’t change over the three year release cycle, this version is very different.  This version lays the foundation for new features to be delivered through Feature Packs.  These Feature Packs will bring new features to your on-premises SharePoint farm without having to wait until the next big version.  Microsoft even plans on delivering a set of capabilities specific to the needs of on-premises customers.

    Don’t worry, new features won’t just show up overnight like in Office 365.  Instead, as a SharePoint administrator, you’ll have control over which features you enable in your farm.  This will give you time to plan and communicate change accordingly.  For whatever reason you run on-premises SharePoint, this should be an exciting announcement as it means you won’t get left in the cold waiting for the latest killer feature.  Does that mean every feature from Office 365 is coming down to on-premises? No.  Some features simply aren’t feasible on-premises.  That’s why the hybrid story is so important.  However, it does mean you’ll get updates on-premises faster than ever before.

    Feature Packs will be delivered through the public update channel starting in 2017.  Microsoft will announce more details about the upcoming Feature Packs in the coming months.  To get the new Feature Pack, your company will have to have purchased SharePoint with Software Assurance.  For Enterprise customers, that’s probably most of you.  You’ll notice this is similar to the model that Windows 10 is using and the way it updates as well.

    There is an exciting road ahead for SharePoint.  Be sure and read everything about it in case you missed any of it.

  • About to start a major Intranet project? Take a step back and see what’s coming to SharePoint

    For years, enterprises have been spending huge amounts of money and time building their Intranets on top of the SharePoint platform.  Intranets take lots of planning and development even to get the most basic of functionality.  Throw in heavy branding and responsive design and you’re looking at a significant investment.  Launching a new Intranet has just been too long of a process with too many technical hurdles, but things are going to improve.

    SharePoint team site and mobile app

    Microsoft has announced a new page publishing experience that will make a lot of publishing scenarios much simpler.  It provides an updated page authoring canvas that allows for simple branding and page layouts while still having some extensibility hooks.  Best of all, what you create here is responsive and works seamlessly with the new SharePoint app.  Out of the box, you will be able to quickly create pages without a bunch of up-front configuration.  Remember what you had to do before?  You know, create content types, page layouts, master pages, workflows, page libraries and more.  Not to mention, Microsoft has been telling you to stop customizing master pages for some time now.  You want to go back to that?

    SharePoint becomes a first class citizen in Office 365 – a few years ago, you might have noticed that references to the actual term SharePoint were few and far between in Office 365.  The only real entry point to SharePoint was through the Sites link in the app launcher.  That’s changing.  The link will now say SharePoint and so will the navigation in the suite bar.  Clicking on the link will take you to the new entry point, SharePoint Home, which pushes sites that you frequent right to the center.  It also tracks sites you are following as well as provides links to other sites.  This should make it easier to find many of the sites you need without an organization having to put a lot of thought into the information architecture.  While it won’t outright replace a well-planned information architecture, it’s a great starting point for organizations that have never bothered to really set anything up like that.

    SharePoint home page with activity

    But my Intranet *MUST* do X or we can’t use it – great!  Keep doing what you are doing and customize the whole thing the way you used to.  However, if your requirements are flexible, the first release may be just what you need.  If you are looking for a simple page authoring canvas with little ramp-up, I think you are going to like it.  This upcoming release, I think, will come close to hitting the “80%” mark where it’s good enough to get people publishing content quickly and easily.  If you have advanced needs and you find that you need something more, then you are probably going to have to go back to the conventional publishing model while you wait for new features to come online in future releases.

    The Intranet, not just for huge Enterprises any more.  I have worked at a number of consulting companies and there is good money in helping clients build out elaborate Intranets.  Sure, a lot of that comes down to the planning and design, but the implementation was just overly complex.  Just as Office 365 brought features like Team Sites and Exchange into small organizations years ago, the new modern pages experience is making the Intranet broadly available to smaller organizations.  That’s pretty exciting.


    We are about to start a big Intranet project or are in the middle of one – This is a tricky place to be in and your organization will have to make decisions about timelines.  The new SharePoint Home entry will be here soon, but the modern page publishing features are further out in 2016, and there is limited information right now.  Try and take a look at your requirements and see if the new modern pages experience will meet them.  If you don’t think it will, then continue implementing your new Intranet as usual and take another look at it in the future.  If you think it does meet your requirements, then maybe take a step back and see what happens and use this as an opportunity to fully vet out your define phase.  Ultimately, it comes down to your organization’s priorities, requirements, and timelines.

    The future of SharePoint is bright.  Today has taught us that Microsoft is continuing to invest in the product as a core.  If you missed any of the announcements, be sure and read through them to find out everything that’s coming.

  • Getting started with Azure App Service and Easy Tables

    Azure App Service makes it easy to build a back-end for a mobile application.  One of my favorite features of it is Easy Tables.  Easy Tables makes it easy to create tables in a SQL Azure database and comes with an SDK for several mobile platforms that make it easy to access it.  You can make calls to it using REST, but there are APIs for Apache Cordova, Xamarin, as well as native development.  Some of the APIs even support offline synchronization.  If you are looking to prove out an idea, Easy Tables makes it well, easy.

    There is some great documentation out there for getting started with Easy Tables as well.  However, I ran into a few stumbling points in my early experiences that I thought I would share.  These instructions assume you have created an Azure App Service app.  If you don’t have one, just look for App Services in your Azure portal and click New.  In our case, we are going to be working with a node.js back-end.

    Setting up the database connection

    Setting up the database connection should be simple, but unfortunately it’s not, due to issues in the configuration UI.  Before you go to set up the connection, you will need to create a SQL Server to host your database if you don’t have one already.  This isn’t terribly complicated; just try to pick a region close to the one hosting your App Service App.  After your SQL Server is set up, you need to create the database to host your data.  You can use the free pricing tier to try things out if you haven’t created a free database in this particular region yet (you only get one per region).


    Once you have your database set up, you are ready to configure a connection to it from Azure App Service.  Effectively, you are creating a connection string.  To set up the connection, go to your App Service App and click on the Easy Tables link under Mobile.


    It will then initialize Easy Tables if needed and prompt you to click continue to set it up.


    Here is where it will prompt you to connect to a database.


    After this, it will prompt you to select your existing SQL Server and then database.  The final step is to enter the username and password you set on your SQL Server and database.  Specify a connection string name (whatever you want) along with the credentials.  Click OK and hope for the best.  I mentioned I had problems with the UI, right?  The first time I did this, I couldn’t get it to save my password because it had a semicolon (;) in it.  Remember how I said it is building a connection string underneath?  Semicolon is a delimiter for connection strings and the UI doesn’t handle that at all.  It just fails.  Hopefully this step works for you.  If it doesn’t, there is a way that you can manually set the connection string using Resource Explorer.  That’s more detail than I want to go into today.  If you run into it though, feel free to ping me or leave me a comment and I’ll provide the details.

    Once your database connection has been set up, click on the Initialize App button.  This will destroy any files that may already exist on your App Service App so don’t click it if there is something you need there.  Effectively this sets up folders such as /tables and /api as well as some of the basic node.js configuration.  It also creates a test table called TodoItem.

    Creating new tables

    There are a few ways to create new tables but not all of them have the desired results:

    • Create the appropriate files in the tables folder
    • Create the table through the Azure Portal in the Easy Tables section
    • Manually create the table in SQL Server

    I would say the preferred way to create the table is by creating the appropriate files in the tables folder.  The node.js back-end will create a new SQL table for any .js file you add to the tables folder (assuming the appropriate supporting files are present).  For example, if you create a file named Event.js, you will get a SQL table called Event.  If you create a file named Contact.js, you will get a SQL table called Contact.  You get the idea.  There is a little bit more to it though.  Let’s look at the steps involved.

    First, you need to be able to edit the files of your node.js App Service app.  You have several ways to do this.  I recommend setting up a local git repository and pushing files up to your App Service App.  You can configure that along with credentials in the Deployment source setting of your App Service App.  You can find more about setting up the node.js back-end and deployment in this article.  However, you can also just edit the files directly from Easy Tables.  If you already have a table created, click on the table and then click the Edit script button to edit the files directly in Visual Studio Online.


    Here you can edit the files directly and they are saved immediately.  I am going to start by creating a table to store event information named Event.  That means we need to create a file named event.js.  Here is what the starting file looks like.

    var azureMobileApps = require('azure-mobile-apps');
    var table = azureMobileApps.table();
    module.exports = table;

    According to the documentation, that is all that is required to get started with a new table.  Now, you might be wondering, where are the column names and types?  Technically, they aren’t required.  You see, Easy Tables will create new columns on the fly when you make your first insert.  This is great to try things out, but not really what you want to do in a production environment.  So I like to specify my own columns.  You can use types such as string, date, boolean, and number.  More complex types aren’t supported by the API.  To create your columns, put them in an object and assign it to the columns property of table.  Then be sure and set dynamicSchema to false so that it won’t create new columns on the fly.  Set these values before assigning module.exports.

    table.columns = {
    	"title": "string",
    	"start_date": "date",
    	"end_date": "date",
    	"description": "string",
    	"image_url": "string",
    	"cost": "string",
    	"event_type": "string",
    	"send_notification": "boolean"
    };
    table.dynamicSchema = false;

    Don’t worry about creating columns for things like an Id, created and modified dates, or even soft delete.  Easy Tables will create those columns for you automatically.
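    To give a rough idea, here is a sketch of what a returned record might look like with those automatic columns filled in (the values shown are fabricated samples; the column names follow the azure-mobile-apps node.js defaults):

    ```javascript
    // Illustrative record shape only -- the id, timestamps, and version below
    // are made-up sample values.
    var sampleRecord = {
    	id: '0a1b2c3d-0000-0000-0000-000000000000', // auto-generated GUID string
    	title: 'Event Title',
    	createdAt: '2017-01-12T09:00:00.000Z',      // set on insert
    	updatedAt: '2017-01-12T09:05:00.000Z',      // set on each update
    	version: 'AAAAAAAAB9E=',                    // row version for concurrency
    	deleted: false                              // present when softDelete is enabled
    };
    ```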

    Before Easy Tables goes and creates your back-end table, there are a few more steps.  First, you need a corresponding .json file.  In our case it would be Event.json.  This contains some basic properties such as whether soft-delete is enabled and whether certain operations require authentication.  I found no documentation whatsoever that said this file was required.  However, in the TodoItem samples out there on GitHub the file was always there.  Here is what it looks like.

    {
    	"softDelete": true,
    	"autoIncrement": false,
    	"read": { "access": "anonymous" },
    	"insert": { "access": "anonymous" },
    	"update": { "access": "anonymous" },
    	"delete": { "access": "anonymous" },
    	"undelete": { "access": "anonymous" }
    }

    If you want authentication to be required, you can change the access for the corresponding operation to “authenticated”.  However, we’ll cover that in a different post.

    At this point, you should see an entry for your new table in the Easy Tables section of the Azure portal.  However, the underlying SQL table does not exist yet.  In fact, it doesn’t get created until you make your first API call to it.  That part took me a while to figure out.  Maybe there is a way to automate it getting created but in my experience it doesn’t happen until then.  There is actually a parameter called seed that will let you put sample data into your table, but I have never successfully gotten it to work.

    Calling the API using JavaScript

    In my example, I am using Apache Cordova.  It’s easy to get started with the Cordova SDK and Azure App Service.  Just add the plugin and create a client object and start querying.  Take a look at this post for more details.  If you are using straight HTML / JavaScript, the code is basically the same as well.  First, create a client object using the URL to your App Service.  You can get this from the Azure Portal.

    var client = new WindowsAzure.MobileServiceClient(azureMobileClientUrl);

    Now, we can call getTable with the table name to get the event table.  Selecting data from it is easy: just call .read() to get everything.  It returns a promise so that you can work with your results and errors accordingly.

    var table = client.getTable('event');
    return table.read()
    .then(function (events) {
    	// do something with events
    }, function (error) {
    	// do something with errors
    });

    If you want to filter your data, just add a where clause.  Each item you add to the object will be combined with an “AND” operator.  Note that only simple equality comparisons are supported though.

    var table = client.getTable('event');
    return table
    	.where({ id: myId })
    	.read()
    .then(function (events) {
    	// do something with events
    }, function (error) {
    	// do something with errors
    });

    You can also use .orderBy() to control the order of the data as well.  It can be used in conjunction with the where clause if desired.

    var table = client.getTable('event');
    return table
    	.orderBy('start_date')
    	.read()
    .then(function (events) {
    	// do something with events
    }, function (error) {
    	// do something with errors
    });

    Making any of the above calls is enough to get your table created.  You can then go to Server Explorer –> Azure –> SQL Databases and verify that the table was created.  This is a great way to look and see what data is there as well.

    Have a look at the SDK reference for inserting and updating data.  You simply need to create an object with the matching column names, specify the values, and call .insert() or .update() accordingly.  Remember you don’t need to specify values for the Id or any of the other fields Easy Tables creates.  The response you get back will be the inserted or updated data.

    var table = client.getTable('event');
    return table.insert({
    	title: 'Event Title',
    	description: 'This is an event description',
    	start_date: new Date(),
    	end_date: new Date(),
    	event_type: 'Special Event',
    	send_notification: true
    }).then(function (response) {
    	// response contains the inserted record
    }, function (error) {
    	// do something with errors
    });
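    An update works the same way.  Here is a sketch of the payload (the id below is a fabricated sample; in practice use the id returned from an insert or read):

    ```javascript
    // Update payload: include the id plus only the fields you want to change.
    var updatePayload = {
    	id: '0a1b2c3d-0000-0000-0000-000000000000', // fabricated sample id
    	title: 'Updated Event Title'
    };
    // Then, with the same table object as above:
    // table.update(updatePayload).then(function (response) {
    // 	// response contains the updated record
    // }, function (error) {
    // 	// error
    // });
    ```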

    If you need to delete data, the one thing I ran into is that it only works with the Id field.  If you try to delete based on some other value, you’ll get an error.  If you need to delete based on some other field, you will need to create your own API.  We’ll cover that in another post.

    var table = client.getTable('event');
    table.del({ id: id })
    .then(function () {
    	// deleted
    }, function (error) {
    	// error
    });

    Mobile Apps in Azure App Service is an evolution of Azure Mobile Services.  I often find there is more documentation on that since it’s been around longer.  For example, take a look at this article on working with mobile data as it has a lot more examples.


    Azure App Service Easy Tables make it super easy to get started creating a back-end for your mobile app.  Give them a try and see what you can create.  If you are looking for some samples, be sure and check out the Azure Mobile Apps repository on GitHub.

  • Complete list of Office 365 Connectors for Groups

    Office 365 Connectors for Groups provide a way to pull in information from various sources such as Twitter, GitHub, UserVoice, and Trello.  Office 365 Connectors deliver external content straight to the inbox of your Office 365 Group.  Last fall, I talked about how to use Office 365 Connectors while they were in preview.  However, now they are available for customers with tenants in First Release. 

    Now, there are over 50 Office 365 Connectors available.  Many are for services I have never heard of (maybe you have).  Here is the complete list (as of now):

    • Aha! – Manage work with a visual roadmap.
    • Airbrake – Captures errors and aggregates the results for developer review.
    • Alerts by MarketSpace – Company monitoring for marketing, PR, development and investment teams.
    • AppSignal – Track throughput, response times, and error rates in your apps.
    • Asana – Task and Project Management.
    • Beanstalk – A code hosting workflow.
    • Bing News – Search news for your work and personalized email updates.
    • BitBucket – Manage and collaborate on your code projects.
    • Biztera – Biztera simplifies decision-making with cloud software that streamlines approval workflows
    • BMC TrueSight Pulse (Boundary) – Monitor cloud and server infrastructure with real-time visibility.
    • Brandfolder – Gain great controls of your brand assets, all from Office 365.
    • Bugsnag – Track issues for your web and mobile apps.
    • Buildkite – Smart automation for your software development processes
    • Caller Zen – Create an SMS call center in seconds.
    • Chatra – Modern Live Chat Software.
    • CircleCI – Build, test, and deploy software continuously.
    • Clearlogin – Integrate Clearlogin with your group mail to receive real-time notifications about user activity.
    • Cloud 66 – Build, deploy, and manage applications.
    • Codeship – Automate the workflow for development and deployment.
    • Crashlytics – Track errors in mobile apps.
    • Datadog – for Dynamics CRM
    • Delighted – The fastest way to gather actionable feedback from your customers.
    • Dynamics CRM Online – Manage your customer sales, marketing, and service relationships.
    • Enchant – Provide customer support with chat and email.
    • Envoy – The new standard for visitor registration
    • GhostInspector – Build or record automated tests for your web site.
    • GitHub – Manage and collaborate on code projects.
    • GoSquared – Simple, yet powerful analytics for your online business
    • Groove – Provide customer support with team collaboration.
    • HelpScout – Provide customer support through email messages.
    • Heroku – Build and run applications in the cloud.
    • HoneyBadger – Exception and uptime monitoring for your web apps
    • Incoming Webhook – Send data from a service to your Office 365 group in real time (this is how you can bring in your own data to Office 365 groups)
    • IQBoxy – Intelligent Expense Management with Real-Time Receipt Processing and Expense Reports
    • JIRA – Gather, organize, and assign issues detected in your software.
    • JustReply – Simple time tracking for teams.
    • Librato – Real-Time Cloud Monitoring
    • Logentries – Search and monitor log data from any environment.
    • Magnum CI – Build, test, and deploy software continuously.
    • MailChimp – Email marketing service
    • MeisterTask – An intuitive task management and collaboration tool for agile projects
    • OpsGenie – Alert and Notification solution
    • PagerDuty – Track and manage incidents, and define escalation policies.
    • Papertrail – Track and manage incidents and downtime issues.
    • Pingdom – Track uptime/downtime and performance of web sites.
    • Pivotal Tracker – Track progress in agile projects.
    • Raygun – Monitor crashes in web and mobile apps.
    • RSS – Get RSS feeds for your group.
    • Runscope – Log, monitor, and measure your API usage.
    • Salesforce – Manage sales opportunities
    • Sentry – Capture and aggregate exceptions in code.
    • Stack Overflow – Follow tagged questions and provide answers.
    • – Status page for your app or website.
    • Subversion – An open-source revision control system
    • TestFairy – Test mobile apps and retrace events that precipitate errors.
    • Travis CI – Run tests and deploy apps.
    • Trello – Manage to-do lists and tasks all in one place.
    • Twitter – Receive messages called Tweets.
    • UserLike – Provide live chat support for web sites and mobile apps.
    • UserVoice – Product management and customer support tool.
    • WakaTime – Get updates about your team’s daily coding activity from GitHub, GitLab, or Bitbucket.
    • Wunderlist – Organize and share your to-do lists.
    • Yo – Communicate with others in simple fashion.
    • Zendesk – Zendesk brings companies and their customers closer together.
    • ZootRock – Curated Content in your niche to share to your Groups.

    As you can see, that is quite a list of Office 365 Connectors.  Are there any in that list that you will find useful?  Is there anything you wish was on the list?
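    The Incoming Webhook connector in the list above is the one that lets you bring your own data in: you get a webhook URL for your group and POST a JSON payload to it.  A minimal sketch of building that payload follows; the webhook URL is a placeholder you get when configuring the connector, and the helper function is hypothetical.

```javascript
// Sketch of pushing your own data into a Group via the Incoming Webhook
// connector.  buildWebhookMessage is a hypothetical helper; at minimum the
// payload needs a "text" property, and "title" is optional.
function buildWebhookMessage(title, text) {
  return JSON.stringify({ title: title, text: text });
}

var payload = buildWebhookMessage('Build complete', 'Build 1.0.42 finished successfully.');

// To deliver it, POST the payload as JSON to your group's webhook URL
// (sketched; '<your-webhook-url>' is a placeholder):
// fetch('<your-webhook-url>', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: payload
// });
```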

  • 10 tips a consultant can give to a company issuing an RFP

    In my days, I have seen a lot of RFPs, RFIs, and RFQs.  I’ve seen RFPs that are simple, complex, small, large, strict, well-executed, and not-so-well-executed.  In my experience, there are a lot of things you can do to ensure you get the best possible response.  If you are considering issuing a Request for Proposal (RFP) for your next project, consider the following first.

    1) Set realistic dates

    When you issue an RFP, it takes a heap of time for a consulting firm to mobilize a team, get the right people on the call, and prepare a response.  The more complicated your RFP response requirements are, the longer it takes to prepare a response.  I’ve received RFPs which want a turn-around in less than a week.  When this happens, often one or more of the bidding firms will push back on you or simply decline to respond.  This often causes multiple extensions to the RFP deadline, which honestly doesn’t make your company look good.  When your RFP process is not organized, it makes me question whether we want to do business with your company.  If you can’t run an RFP without a lot of issues, you may not be able to successfully run a project initiative or pay your invoices on time.

    Also pay attention to the dates you set.  Setting an RFP to be due the day after Christmas or during the Thanksgiving break is not cool.  You don’t want to work on the holidays.  Your respondents don’t either.

    2) Don’t ask for references in your RFP response

    When applying for a job, I am not going to give you a list of references before I have even talked to someone at your company.  When responding to an RFP, it’s not any different.  It’s not that we don’t want you to vet us out and know our qualifications.  You have to understand references are difficult.  You are asking for us to ask our previous clients to take time out of their day and talk to you about a project on our behalf.  Chances are you aren’t the only RFP we are responding to at a given time either.  That means that client could be receiving multiple calls.  Think about it.  If you select my firm and we successfully implement your project, are you willing to be a reference for any number of random callers?  I doubt it.  Reference calls need to be scheduled and prepared for in advance.  I can’t just have you calling them blindly.

    Instead of asking for references, ask for qualifications.  If you have questions about the qualifications, ask the respondent to talk about them when you down-select vendors and go to orals.  At that point, it might be acceptable to ask to set up a reference call.

    3) Don’t require answers to questions that aren’t relevant

    You know what makes vendors question if they even want to respond to your RFP?  Required responses to a bunch of questions that aren’t relevant.  I once had an RFP for an Office 365 implementation ask questions such as “Does your product include printed manuals?”.  My response: “We plan on documenting your implementation.  We can print it off for you if you would like.”  In the question period, I even asked about these irrelevant questions and the company insisted they all should be completed.  Make sure you are asking the right questions.

    Don’t ask too many questions either.  Keep in mind that every vendor you pull in is going to mobilize a team of people to put together a response.  This likely includes time from the account manager, a vice president or two, an architect, and maybe an offshore team.  It’s not uncommon for the team to spend several hundred hours combined on their response.  For smaller companies, that often means using billable resources such as the architect.  That person is either billing less or working overtime.

    4) Respond to questions in a timely manner

    Potential vendors ask you questions to clarify their understanding of your needs.  Your answers are often critical to building their response.  If you don’t reply to questions until two days before the RFP is due, that is going to strain the respondents.  It also means you might not be getting the best response possible out of your bidders because they didn’t have adequate time to prepare it.  Try to get your responses back at least a week before the RFP is due.

    5) Stop asking for fixed-bid

    So the project you are working on is risky with a lot of unknowns?  Great, let’s slap a fixed-bid requirement onto the RFP.  That way if there is an issue, you can blame the consultants! 

    Do you not realize what happens when you do this?  You are automatically paying 20% more at the minimum.  The winning firm is going to do everything they can to lock in scope and assumptions so they don’t end up losing money on the deal.  Not to mention, that there is rarely enough detail in terms of requirements in the RFP itself to make an accurate estimation.

    If for whatever reason the consultant can’t deliver on the fixed bid, eventually you are going to hit a point where you are straining the relationship with your consulting firm.  This is when talk of lawyers comes in, and then neither of you wants to do business with the other ever again.

    Your best bet is to fixed bid a scoping engagement to properly map out the requirements and technical design.  From there, you can get a more accurate estimate on the implementation and you will likely end up paying less.

    6) Don’t include too many vendors

    Keep the number of vendors to a minimum.  Remember, you are asking people at every firm you contact to jump through a lot of hoops.  You’re just creating more responses that your RFP team has to read through and rank.  They will all start to look the same after a while, too.  And definitely don’t let one more vendor in because some sales rep got wind late in the process that you were issuing an RFP.

    7) Stop asking for active and past litigation

    Are the lawyers at your company going to provide a list of all lawsuits you have ever been involved in to someone random?  Why do you expect us to?  Companies get sued all the time, especially the larger ones.  I don’t see many firms providing you this information.  If you really want the details, you can go dig them up.

    8) Don’t be so picky about the response format

    I think it’s OK to ask respondents to limit the length of their response to X pages.  It’s not cool to have phrases like “adding a column to this spreadsheet is grounds for being kicked out of the RFP process”.  I understand you have to review multiple RFPs and you are trying to keep things consistent, but specifying which font to use or a maximum file size for an exhibit is just silly. 

    9) Don’t ask for hard copies

    I’m looking at you, government and health care companies.  In this day and age, asking for a hard copy to be delivered in person or by courier is just silly.  Why don’t you just have us chisel the response into stone tablets?  Let us deliver the response electronically via e-mail or through an RFP response portal.  If you just want to print it out because you want to scribble notes on the paper, may I introduce you to the Surface Pro 4.  You can use digital ink to mark up or highlight the response as needed.  If you really do need paper copies, print them out yourself as needed.  Are you just trying to save on printing costs?  If you are, that might be a sign you don’t need to do this project. 

    10) Don’t issue an RFP if you are just going to pick the incumbent

    Again, be mindful of other people’s time.  RFPs often require sizable teams and late hours to meet the deadline.  If you know you are going to pick the incumbent before even starting the RFP process, that is absolutely bad form.  There is a special place in hell for companies that issue an RFP and then just pick the vendor they already had.  Work out whatever issue you had with your vendor, go with them, and skip the RFP process.  If it is procurement pushing you to issue an RFP so that you will get the “best deal”, it’s time to get a new procurement department. :)  At the very least, let the respondents know that there is an incumbent at play.

    Bonus Tip – Don’t ask for names of resources

    Don’t ask for the names of resources that will be staffed in your RFP response.  Do you really think we have an entire project team sitting around on the bench just waiting to be staffed on your project if we happen to win it?  Our company wouldn’t be in business long if we did.  Keep in mind you aren’t the only RFP we are responding to at a given time.  If you really thought we had resources lying around at all times, multiply that by the number of active RFPs and that’s a lot of people not bringing in revenue.  Nothing shows that your company has no idea how the consulting industry works more than asking for the names of resources in an RFP response.  I can almost assure you that whoever we list by name won’t be the person you get at the start of an engagement.  If you want to know the background of the people you are staffing, wait until you select a vendor and then ask for a profile.

    Do you really need to do an RFP?

    You have smart people at your company, but maybe you just don’t have enough of them.  That’s why you are issuing an RFP, right?  Maybe you are looking for a certain skill set you don’t currently have?  Before opening up an RFP for your next project, ask if you really should.  Your smart people should already have established relationships with a handful of vendors.  Call some of them, tell them what you are trying to accomplish, and just ask for a proposal.  Setting up an entire RFP process is long and overly complicated.  If you already have an established relationship with a few vendors, why shop it out?  If they have done good work in the past, they probably will in the future as well.  If anything, you know what you are dealing with.  Do you really want to go with a different vendor for every project just so you can ensure you get the best price?  I understand there are other reasons to issue an RFP, but you really have to ask yourself if it’s really worth it.

    Posted Feb 12 2016, 10:21 AM by CoreyRoth with 1 comment(s)
  • Stalking your favorite celebrities with Office 365 Connectors for Groups

    Office 365 Connectors are a new extensibility point for Groups.  They provide a way to pull in information from various sources such as Twitter, Bing News, and Trello.  The information they provide will be dumped right into your group’s inbox.

    Office 365 Connectors for Groups are new.  Really new.  So new in fact that you have to enable them with a special query string parameter, EnableConnectorDevPreview=true.  The easiest way to enable them is to go to the URL below.

    When you go to that URL, you’ll see a new option for Connectors under the … menu of your group.


    There, you will see a list of all connectors currently available.  This includes things like Bing News, Twitter, JIRA, RSS, Trello, GitHub, and a few others.


    Since we are looking to stalk celebrities, we’ll make use of the Twitter and Bing News connectors.  I want to know any time the celebrities I am stalking send out a tweet.  When they do tweet something, what they say will automatically be delivered to my Group’s inbox.  If you are subscribed to the Group, the tweets will end up in your inbox as well.

    Let’s add the Twitter connector.  The first thing you will need to do is add an account if you haven’t provided one yet.  Sign-in with your credentials to proceed.


    Now, let’s configure the celebrities we want to stalk by adding their Twitter handles (minus the @) separated by commas.  You can only enter up to 50 characters in the box, so if you have lots of people to stalk, you can just add another instance of the Twitter connector.
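    Since the box only takes 50 characters, a longer list of handles has to be split across multiple Twitter connector instances.  A quick sketch of that chunking (the helper function is hypothetical, and the handles are just the ones from this post):

```javascript
// Split a list of Twitter handles (without the @) into comma-separated
// strings of at most maxLength characters, one string per Twitter
// connector instance.  chunkHandles is a hypothetical helper.
function chunkHandles(handles, maxLength) {
  var chunks = [];
  var current = '';
  handles.forEach(function (handle) {
    var candidate = current === '' ? handle : current + ',' + handle;
    if (candidate.length <= maxLength) {
      current = candidate;        // handle still fits in this connector
    } else {
      if (current !== '') chunks.push(current);
      current = handle;           // start a new connector instance
    }
  });
  if (current !== '') chunks.push(current);
  return chunks;
}

var chunks = chunkHandles(['wilw', 'ActuallyNPH', 'coreyroth'], 50);
// these three handles fit in a single 50-character connector instance
```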


    You can also track hashtags this way, as well as mentions and retweets.  It’s fairly versatile.  When you are done, click the Save button.  When you click it, it will give you no response and just wait there for a few seconds.  Remember, this is a developer preview.  If you click it multiple times, you will end up with multiple connectors, so just be patient. 

    Now, I want to see whatever they say in the news about my celebrities so I am going to add a Bing News Connector.  This will give you a digest once a day of whatever terms you ask it to search on.  Here I have used the Bing News Connector to send me the latest on Neil Patrick Harris.


    Once you are done adding connectors, you can see the ones you added on the Connectors page.  You can reconfigure them here as needed as well. 


    When you return to your group inbox, there will be notifications that your connectors are active.


    Now, you just wait until your celebrities make their move (and for the connectors to fire).  When I first configured them last week, they didn’t work.  Remember, we are in a developer preview.  They started working yesterday and the data started coming in bulk (and so did the notifications).  In my experience today, the Twitter connector usually delivers tweets within 5 minutes of when they occur.  The Bing News connector delivers a digest whenever it feels like.  I’m not really sure what time zone it is executing in, but I assume it’s the local one.

    After your connectors have started providing data, here is what it looks like.  For example, here is the twitter connector providing Wil Wheaton’s latest tweet.  You’ll get one entry in the inbox for each tweet.  The formatting of URLs, usernames, and hashtags could use some improvement, but all of the information is there.


    Here is what the results look like in Outlook 2016.


    It’s up to the connector to provide the formatting, so you can see the results from Bing look a bit different.


    All kidding aside, I hope you can see the power of Office 365 Connectors.  Instead of stalking celebrities, you could use this same technique to quickly deliver information about your company’s competitors right to your inbox.  There is also a developer story using Webhooks.  The last I checked, the URL it provided me didn’t work yet, but we can expect it is coming soon.  Office 365 Connectors are powerful and I am looking forward to using them more.

    I hope you found this look at Office 365 Connectors useful.  Happy stalking!

    Stalk me on twitter: @coreyroth.

  • Drag and Drop into Office 365 comes to Edge on Windows 10

    Looking back to the SharePoint 2013 launch, drag and drop into a document library was one of the hit features.  As many of us have transitioned to Office 365, it’s a feature we expect to be there in our document libraries and OneDrive.  However, when Windows 10 was released at the end of July, the feature was noticeably missing from Microsoft’s newest browser, Edge.  Drag and drop support didn’t work in Office 365, OneDrive, or anywhere else for that matter.

    With the Threshold 2 (TH2) fall update for Windows 10 last week, we now have drag and drop support.  For Windows Insiders, this support has been there in preview builds for a while but it hasn’t been talked about much.


    Say what you want about Edge, I have had pretty good luck with it.  Some people act like it is completely unusable, but I can get most of my day to day tasks done with it.  I’ve used it to replace Google Chrome for almost all of my tasks.  Does it still have issues and does it still need more features?  Absolutely, but it gets the job done for the most part.  If you have discounted it before, give it another try.  It’s constantly being updated.

    One cool feature that came with the fall update is the ability to cast media to a device directly from Edge.  What this means is that when you visit a page with a video on it, you can select cast to device in the menu and choose any Miracast or DLNA supported device to view that content on the remote screen. 

    If you’re not familiar with Miracast, it lets you wirelessly transmit what’s on your screen to a TV or monitor.  This is included in a lot of devices now, such as the Roku.  You can also pick up a Microsoft Wireless Display Adapter.  What’s nice is that you can cast a video to a TV from your Windows 10 device and still use the device while the video is showing.  With Miracast support before, you had to project your entire desktop or extend it just like you did with an external monitor.


    I think it’s a useful feature, similar to what Chromecast has been doing for some time now.  However, Miracast is a more open standard, supported across a variety of devices.  The nice thing is that it doesn’t require any additional drivers, plugins, or software to make it work.

  • SharePoint 2016 installation first look

    Today Microsoft released the IT Preview of SharePoint 2016.  We’re going to look at the install process today and point out any differences between previous versions of SharePoint.  You can find out more about SharePoint 2016 from the Office blog post.


    I created a new virtual machine running on Windows 10 Hyper-V.  This virtual machine is running Windows Server 2016 Technical Preview 3.  I have promoted the server to a domain controller using Active Directory Domain Services.  I have also installed SQL Server 2014 R2.

    Installation of SharePoint 2016 IT Preview looks similar to previous SharePoint installations.  When you mount the ISO, you will see a familiar splash screen.


    Installing the prerequisites

    The prerequisite installer looks similar to previous versions.  Click Next to continue.


    Accept the license terms.


    Wait and hope it finishes successfully.


    Installation of the SharePoint 2016 IT Preview prerequisites is just as troublesome as in previous versions.  I managed to generate a few errors and never did get it to agree that the role was successfully installed.  I had to install the Visual Studio 2013 Redistributable myself.  There is a work-around for the IIS role configuration step on the Known Issues in SharePoint Server 2016 Preview page.


    SharePoint 2016 Installation

    The installation is quick and easy.  It also looks similar to other versions.  Start by entering the product key.  Remember this was found on the SharePoint 2016 download page.


    Next, accept some more license terms.


    Specify the install location for your search index.  You can just use the defaults for this preview installation.


    Wait for the installation to complete.


    When installation completes, you will be prompted to run the Configuration Wizard.


    Running the Configuration Wizard

    The Configuration Wizard also looks similar but has a few changes.  Click Next to proceed.


    Now, create a new farm.


    Then, specify your farm passphrase.


    Specify your farm account, SQL Server name and database.


    This new screen allows you to use the new MinRole feature.  MinRole simplifies the server architecture of SharePoint 2016.  In this case, we are going to use a Single Server Farm (which shouldn’t be as bad as in previous versions).


    Specify the details for Central Administration.


    Confirm your settings.


    When it finishes, you are ready to try out Central Administration.


    Now, we can start our configuration.


    That’s a quick look at the installation process of the SharePoint 2016 IT Preview.  Be on the lookout for my next posts covering changes in Central Administration and the UI.

    Follow me on twitter: @coreyroth

  • How to: Play an Office 365 Video from a Universal Windows App in Windows 10

    I’ve had a personal interest in figuring out how to play videos from the Office 365 Video Portal in a Universal Windows App (UWP) in Windows 10 since Microsoft Ignite.  In reality, the process isn’t that difficult, but there wasn’t a lot of detail out there on how to put the pieces together until recently.  Today, I am going to walk you through how to retrieve the first video in a channel and play it using a MediaElement.

    Today, I combined elements I learned from Richard DiZerga’s Universal App blog post, Chakkradeep Chandran’s Ignite session, as well as Andrew Connell's MVC Sample.  Special thanks to all of them for having the pieces I needed to put all of this together.  Since this is a Universal app it should work across multiple devices in theory.

    This app could easily be extended to create a nice interface to browse a channel and pick a video.  If you’re familiar with data binding in WPF though that shouldn’t be hard for you, so I won’t cover that today.  We’re simply going to look at the necessary API and REST calls to get the secure playback URL.  Once we have that URL, we can have the MediaElement start playing the video.

    Getting started

    First, you’ll need to have installed Windows 10 as well as the final version of Visual Studio 2015.  We’re also assuming you have an Office 365 tenant, the Video Portal turned on, and you have some videos uploaded.  If you have never created a Universal Windows App before you can find it under your favorite programming language –> Windows –> Universal –> Blank App (Universal Windows).


    Playing a video from the Office 365 Video Portal involves several steps.  If you know some of the values already along the way (such as the Video Portal URL), you can skip some steps.  I am going to include all of them (which means this blog post is going to be long).  The steps we need to go through are:

    1. Authenticate with Azure Active Directory / Office 365
    2. Access the Discovery Service to get the RootSite ServiceResourceId
    3. Determine the Video Portal Hub Url (i.e.:
    4. Get the list of channels
    5. Get the videos inside a channel
    6. Get the video playback URL

    As you can see, that’s a lot of service calls.  Caching is a good thing.
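    Since each of those steps costs a network round trip, caching the tokens per resource saves repeating step 1 on every call.  A minimal sketch of that idea, in JavaScript for brevity rather than the C# used below; the fetchToken callback here is a hypothetical stand-in for whatever actually acquires a token (in this walkthrough, the Web Account Manager):

```javascript
// Minimal per-resource token cache.  fetchToken is a hypothetical async
// callback that actually acquires a token for a given resource URI.
function TokenCache(fetchToken) {
  this.fetchToken = fetchToken;
  this.tokens = {};   // resource URI -> cached token
}

TokenCache.prototype.getToken = function (resource) {
  var cache = this;
  if (cache.tokens[resource]) {
    // already have a token for this resource; skip the round trip
    return Promise.resolve(cache.tokens[resource]);
  }
  return cache.fetchToken(resource).then(function (token) {
    cache.tokens[resource] = token;
    return token;
  });
};
```

    This mirrors how the repository class below stashes sharePointAccessToken after the first acquisition; a production cache would also track token expiry.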

    Connecting to Office 365

    There is no shortage of ways to authenticate with Office 365 and Azure Active Directory.  Windows 10 introduces a new one called the Web Account Provider.  This will provide the token that we pass to the Discovery Service, as well as tokens for other services such as SharePoint or Outlook.  It will also prompt the user to enter his or her credentials if we don’t have a token yet.

    To connect to Office 365 bring up the context menu on the project in Solution Explorer and choose Add –> Connected Service.  Select Office 365 APIs and then click Configure.


    The nice thing about the Connected Service dialog in Visual Studio 2015 is that it takes care of all of the manual Azure Active Directory application setup.  You don’t need to manually copy a ClientId, secret, permissions, or anything.  This makes the new developer experience much better.

    Now, you need to specify the domain of your tenant (i.e.:


    Clicking Next will prompt you for authentication.  Enter your Office 365 credentials and then you can proceed to the next step.


    Now, we need to tell Visual Studio to Create a new Azure AD application to access Office 365 API services.  This will register the new application in Azure AD and then we can request permissions.


    On the next six steps of the Wizard, we set the permissions that the app requires.  For the purpose of playing a video, we don’t need access to Calendar, Contacts, Mail, or My Files.  For Sites, we need the following permissions:

    • Read and write items and lists in all site collections
    • Read and write items in all site collections
    • Read items in all site collections


    For Users and Groups, select the following permissions:

    • Sign you in and read your profile


    Finally, hit Finish and Visual Studio will add the connected service to your project.

    Designing the Page Layout

    Now we’ll edit MainPage.xaml and add our controls to it.  I kept this simple by just adding a Button control and a MediaElement.  You can arrange them as you like.


    Authenticating with Web Account Manager

    Before we get too far, I am going to create a folder for the Models.  I am going to skip a formal Controller class today since we’re not doing any real data binding.  Normally you would have multiple controllers to pull up a list of channels, videos, etc.  We’ll use a repository class, VideoRepository, to connect to Office 365 and retrieve our data, along with model classes to deserialize the JSON that comes back from the REST API. 

    The VideoRepository class will also handle authentication.  We’ll add the following using statements.

    using System.Net.Http;
    using System.Net.Http.Headers;
    using Microsoft.Office365.Discovery;
    using Microsoft.Office365.SharePoint.CoreServices;
    using Windows.Security.Authentication.Web.Core;
    using Windows.Security.Credentials;
    using Newtonsoft.Json;

    The first method we will create is GetAccessTokenForResource.  If you have looked at other Universal App samples, this should look fairly similar.  The purpose of this method is to attempt to get a token silently (without prompting the user).  If it can’t, then it will show the Office 365 login and the user can log in, which will provide the token. 

    public async Task<string> GetAccessTokenForResource(string resource)
    {
        string token = null;

        //first try to get the token silently
        WebAccountProvider aadAccountProvider
            = await WebAuthenticationCoreManager.FindAccountProviderAsync("");
        WebTokenRequest webTokenRequest
            = new WebTokenRequest(aadAccountProvider, String.Empty, App.Current.Resources["ida:ClientID"].ToString(), WebTokenRequestPromptType.Default);
        webTokenRequest.Properties.Add("authority", "");
        webTokenRequest.Properties.Add("resource", resource);

        WebTokenRequestResult webTokenRequestResult
            = await WebAuthenticationCoreManager.GetTokenSilentlyAsync(webTokenRequest);
        if (webTokenRequestResult.ResponseStatus == WebTokenRequestStatus.Success)
        {
            WebTokenResponse webTokenResponse = webTokenRequestResult.ResponseData[0];
            token = webTokenResponse.Token;
        }
        else if (webTokenRequestResult.ResponseStatus == WebTokenRequestStatus.UserInteractionRequired)
        {
            //get token through prompt
            webTokenRequest
                = new WebTokenRequest(aadAccountProvider, String.Empty, App.Current.Resources["ida:ClientID"].ToString(), WebTokenRequestPromptType.ForceAuthentication);
            webTokenRequest.Properties.Add("authority", "");
            webTokenRequest.Properties.Add("resource", resource);

            webTokenRequestResult
                = await WebAuthenticationCoreManager.RequestTokenAsync(webTokenRequest);
            if (webTokenRequestResult.ResponseStatus == WebTokenRequestStatus.Success)
            {
                WebTokenResponse webTokenResponse = webTokenRequestResult.ResponseData[0];
                token = webTokenResponse.Token;
            }
        }

        return token;
    }

    Let’s look at some of the highlights.  First we get our aadAccountProvider by calling WebAuthenticationCoreManager.FindAccountProviderAsync().  This is the call into the Web Account Manager that I have been talking about.  You always pass it the (at least for this exercise).  We then need a WebTokenRequest.  This is where the ClientId of the application comes in.  Remember, Visual Studio created this for us when it registered our app with Azure Active Directory.  The resource property is where we pass in the Uri of the service we want to access.  On our first call this will be the Uri of the discovery service.  On subsequent calls it will be the Uri we get for accessing things in SharePoint.

    Next, the method tries to silently authenticate the user.  This method would return with a Success result if the user has authenticated inside this application before. 

    WebTokenRequestResult webTokenRequestResult
        = await WebAuthenticationCoreManager.GetTokenSilentlyAsync(webTokenRequest);

    If the user already has a token, then we are done.  Otherwise user interaction is required. The RequestTokenAsync call on the WebAuthenticationCoreManager is what actually prompts the user for credentials.  Ultimately, this method returns a token for the resource you asked for.  When we run the application and click the Play button, it will prompt us for credentials.


    After you authenticate, you will get prompted to grant permissions to the app.  These will match the ones we specified earlier when adding the Connected Service.


    Calling the Discovery Service

    We’re going to call the Discovery Service to determine the ServiceEndpointUri for the RootSite capability. We’ll do this in a new method called GetSharePointServiceEndpointUri, which starts by requesting an access token from our GetAccessTokenForResource method, passing the URL of the discovery service.

    string accessToken

        = await GetAccessTokenForResource("");

    DiscoveryClient discoveryClient = new DiscoveryClient(() =>
    {
        return accessToken;
    });


    After we get a DiscoveryClient, we can call DiscoverCapabilityAsync.  Our GetSharePointServiceEndpointUri method stores the access token used for accessing resources in the variable sharePointAccessToken, and it stores the ServiceEndpointUri in the variable sharePointServiceEndpointUri.  We’ll use this URI for the subsequent REST calls.

    CapabilityDiscoveryResult result = await discoveryClient.DiscoverCapabilityAsync("RootSite");

    sharePointAccessToken = await GetAccessTokenForResource(result.ServiceResourceId);

    sharePointServiceEndpointUri = result.ServiceEndpointUri.ToString();


    return sharePointServiceEndpointUri;

    Remember, the full source code is available in GitHub.

    Get the Video Portal Hub URL

    If you have worked with Office 365, you probably know that it’s pretty easy to guess what the URL to the Video Portal Hub will be (the hub is the root site collection of the Video Portal).  Since we want to do things right though, we’ll go through the Video Service’s Discover endpoint to determine the URL.  In fact, once we construct these next two methods, the rest will be very similar.

    We’ll create a new method called GetVideoPortalHubUrl.  The first thing we’ll do is call GetSharePointServiceEndpointUri to authenticate and return the endpoint for working with SharePoint.  We then append VideoService.Discover to the Uri returned by our method.  How did we know this was the URL?  It’s documented in the Video REST API reference (still in preview).

    var requestUrl = String.Format("{0}/VideoService.Discover", await GetSharePointServiceEndpointUri());

    We then create a function that returns an HttpRequestMessage using the requestUrl.

    Func<HttpRequestMessage> requestCreator = () =>
    {
        HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, requestUrl);
        request.Headers.Add("Accept", "application/json;odata=verbose");
        return request;
    };
    Then we create a new HttpClient and pass our access token, the HttpClient, and the request creator to a custom method we will build named SendRequestAsync.   This piece is based upon others’ code, so I didn’t really tweak it since it works.  After our call to that method, we’ll deserialize the JSON that comes back.  I borrowed several of AC’s helper classes from his MVC example here.

    Let’s take a look at SendRequestAsync.  We’ll start with a using block to get our HttpRequestMessage.

    using (var request = requestCreator.Invoke())

    We’ll then need to add the necessary headers.  First, we add an AuthenticationHeaderValue with the sharePointAccessToken we got earlier.  We also add another header, X-ClientService-ClientTag.  I don’t know if this is required but Chaks included it in his examples.  I updated the version to match the current version reported in the Connected Service wizard.

    request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

    request.Headers.Add("X-ClientService-ClientTag", "Office 365 API Tools 1.5");

    Then we invoke the request using httpClient.

    var response = await httpClient.SendAsync(request);

    The method then returns the response assuming it’s successful.  The example that Chaks provided will actually make a second attempt to call the service if the user isn’t authorized.  You can optionally include that if you like.
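    Putting those pieces together, SendRequestAsync might look roughly like this.  This is a sketch assembled from the snippets above; the parameter order is assumed, and Chaks’ retry-on-unauthorized logic is omitted.

```csharp
private async Task<HttpResponseMessage> SendRequestAsync(
    HttpClient httpClient, string accessToken, Func<HttpRequestMessage> requestCreator)
{
    using (var request = requestCreator.Invoke())
    {
        // Attach the OAuth bearer token and the client tag header
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        request.Headers.Add("X-ClientService-ClientTag", "Office 365 API Tools 1.5");

        // Invoke the request; the caller reads the body on success
        return await httpClient.SendAsync(request);
    }
}
```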

    Now, going back to our GetVideoPortalHubUrl method: after we successfully get a response back, we need to parse the JSON into a usable object.  This is where Newtonsoft.Json comes in.  We use the models that AC provided in the MVC project. In this case we deserialize the responseString into an object of type VideoServiceDiscovery.  Now we can get the VideoPortalUrl for our subsequent REST calls.

    string responseString = await response.Content.ReadAsStringAsync();

    var jsonResponse = JsonConvert.DeserializeObject<VideoServiceDiscovery>(responseString);

    videoPortalUrl = jsonResponse.Data.VideoPortalUrl;
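    For reference, a minimal version of those model classes might look like the following.  The shapes here are assumed from the odata=verbose response and the Data.VideoPortalUrl accessor above, so check AC’s sample for the originals.

```csharp
// Hypothetical minimal models; odata=verbose wraps the payload in a "d" property
public class VideoServiceDiscovery
{
    [JsonProperty("d")]
    public VideoServiceData Data { get; set; }
}

public class VideoServiceData
{
    public string VideoPortalUrl { get; set; }
}
```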

    Get the video portal channels

    Now that we have our Video Portal Hub URL, we can retrieve the list of channels.  The Video REST API reference tells us we need to append /_api/VideoService/Channels to the Video Portal Hub URL we just retrieved.  We’ll create a new method called GetVideoChannels, which will look similar to GetVideoPortalHubUrl except for the requestUrl used and the JSON deserialization.

    var requestUrl = string.Format("{0}/_api/VideoService/Channels", videoPortalUrl);

    Since we return multiple channels, we loop through the results and add new VideoChannel objects to the list. 

    string responseString = await response.Content.ReadAsStringAsync();

    var jsonResponse = JsonConvert.DeserializeObject<VideoChannelCollection>(responseString);


    var channels = new List<VideoChannel>();

    foreach (var videoChannel in jsonResponse.Data.Results)
    {
        var channel = new VideoChannel
        {
            Id = videoChannel.Id,
            HtmlColor = videoChannel.TileHtmlColor,
            Title = videoChannel.Title,
            Description = videoChannel.Description,
            ServerRelativeUrl = videoChannel.ServerRelativeUrl
        };

        channels.Add(channel);
    }
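    The app-side VideoChannel class implied by the initializer above is just a plain bag of properties; a minimal sketch:

```csharp
public class VideoChannel
{
    public string Id { get; set; }
    public string HtmlColor { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public string ServerRelativeUrl { get; set; }
}
```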

    At this point, you have enough data to start doing some data binding.  As I mentioned earlier, I am skipping that since data binding in a Universal Windows App for Windows 10 is quite similar to what it was in Windows 8.1.  For our example, I am simply going to pass the Channel Id into our next method to retrieve a list of videos.

    Getting the list of videos in a channel

    Now that we have a list of Channels, we can make another Video REST API call to get a list of videos in our new method GetVideos.  We build our requestUrl by appending /_api/VideoService/Channels(‘<channelid>’)/Videos.  Remember we got the Channel Id from our last API call.  Now this simply returns the metadata about the video.  After this, we still have one more call to get the Playback URL.

    var requestUrl = string.Format("{0}/_api/VideoService/Channels('{1}')/Videos", videoPortalHubUrl, channelId);

    The rest of the method is similar to GetVideoChannels.

    Get the video playback URL

    For the purpose of our example, we’re also going to simply play the first Video in the Channel.  To get the Playback URL, we’ll need the Video Id as well as the Channel Id we retrieved earlier.  When it comes to streaming, we have a choice of either Smooth Streaming or HLS Streaming by specifying the streamingFormatType parameter on the Video REST API call.  HLS streaming requires one fewer API call so we are going to use that for the example today.  You may want to use Smooth Streaming for your application.  To use HLS Streaming, specify a value of 0.

    var requestUrl = string.Format("{0}/_api/VideoService/Channels('{1}')/Videos('{2}')/GetPlaybackUrl('{3}')",

        new string[] { videoPortalHubUrl, channelId, videoId, streamingFormatType.ToString() });

    The REST call will return a playback URL, which we can then assign to the MediaElement.

    Playing the video

    On the click handler of the Play button we added in the beginning, we make our calls into the VideoRepository class.  Once we get the Playback URL, we bind it to the MediaElement and the video will start playing.  I also set a few parameters to turn on the transport controls and enable full screen mode.  You can set those as you like.

    private async void button_Click(object sender, RoutedEventArgs e)
    {
        VideoRepository videoRepository = new VideoRepository();
        string videoPortalUrl = await videoRepository.GetVideoPortalHubUrl();
        var videoChannels = await videoRepository.GetVideoChannels();
        string channelId = videoChannels[0].Id;
        var videos = await videoRepository.GetVideos(channelId);
        string videoId = videos[0].VideoId;
        var videoPlayback = await videoRepository.GetVideoPlayback(channelId, videoId, 0);

        mediaElement.Volume = 0;
        mediaElement.AreTransportControlsEnabled = true;
        mediaElement.IsFullWindow = true;
        mediaElement.Source = new Uri(videoPlayback.Value);
    }

    There you have it: a (somewhat) quick and dirty way to play a video from the Office 365 Video Portal in Windows 10.  This should work on Windows 10 Mobile as well.  Here is what our app looks like when playing a video.


    Source Code

    As promised, the source code is available on GitHub.  I may extend it in the future to include all of the data binding and improve the look.  Feel free to contribute to the repo as well.  If you make use of any of the code, you’ll want to add error handling of course.

    Feel free to ask me any questions by leaving a comment.

    Source code on GitHub

    Follow me on twitter: @coreyroth
