Moving house on the internet and SEO

When I left Microsoft a couple of years back I also decided to move house on the internet. I started looselytyped.net and made it my new location on the tubez (aka the internet). I didn’t technically “move” house, I guess, as you can still find my old blog.

In fact that is the problem!

If you search for “Chris Johnson SharePoint” my new site is on the first page of Bing, but not at the top & it’s the 2nd result on Google.  However, my old blog is in the #1 spot in both search engines. Grrrr.

I didn’t want to delete my old blog; that seems like overkill, and old posts still get a lot of traffic. There are 3 or 4 old posts that are knocking past 150,000 views, which must mean someone finds them helpful. But given I have moved house, I want people to find me at my new location. A couple of others have confirmed the issue: they tried to find me and didn’t have much success. A classic SEO problem, some would say.

Unfortunately, I can’t seem to “relocate” my old content to my new blog & put in a good redirection process that would be search engine friendly. My old blog is hosted with Microsoft and there are limited controls as far as I can tell. So I think leaving it in place is my only option.

I have been making a concerted effort to try and get my new location indexed and ranking higher so people can find me. It has been working slowly.

One tool I heard about on the Startups for the Rest of Us podcast was HitTail (www.HitTail.com). I also saw that Andrew Connell was giving it a try, so I thought I would give it a shot.

It’s a very nifty tool.  You sign up (free trial) and then add a wee bit of JavaScript to your page and you are all set. 

What does HitTail do?

In a nutshell, HitTail looks at your existing organic search traffic and shows you which keywords to target when writing content for your site in order to increase your ranking.

Now this seems a bit counterintuitive at first, at least it did to me. Why would I write more content about things people are already finding my site for via search?

Well, the suggestions it makes are based on the fact that people are finding your site for those terms, but that you could be doing a much better job of ranking for them. So it helps guide you on what to write about in order to bump you up the rankings.

So I am trying it out. I am reviewing the suggestions it makes and matching them up with topics I a) want to write about, b) have something useful to say about, and c) know people will find interesting.

At this stage it’s a trial to see how useful I find the suggestions, but I thought it was a handy tool to help with my internet home relocation.

If any of my readers are SEO experts I would love any other suggestions you have around my predicament.  Suggestions welcome! Get in touch on Twitter @LoungeFlyZ

-CJ

More Office 365 storage improvements

Back in March I posted about Office 365 Exchange Online’s online archive features and how great they are for keeping up to 100GB of mail.  Take a read of that here: Exchange online archive awesomeness

Today and earlier this week we saw a couple of announcements from Microsoft around storage and capacity improvements in a couple of key Office 365 areas.

1. SkyDrive Pro. This gets a bump to 25GB of storage by default; previously it was 7GB per user. This is pretty nice for those like me who use SkyDrive Pro to stash their files. Also, you are not limited to 25GB: your Office 365 administrator can bump it up to 100GB too.

2. Exchange Online. Microsoft just bumped the capacity of people’s inboxes to a 50GB max. Time to get my inbox size increased!

It has been really interesting watching the gradual increase in storage quotas, along with price drops, in the Office 365 service. Back in April 2013 there was a significant set of changes in the SharePoint Online service that increased the tenant storage limit to 25TB! What was more impressive, and in my opinion really made SharePoint Online A LOT more affordable, was the price drop that went with it: a 92% reduction, from $2.50 USD to $0.20 USD per GB. To put that in perspective, 10TB of extra storage goes from roughly $25,600 to $2,048 at those rates. That meant customers with a lot of data who were considering moving from SharePoint on-premises became far less likely to get hung up on price than they previously were. With a number of pain points already making a move like that hard, price was a no-brainer to get rid of.

I guess this really shouldn’t be much of a surprise given the cost of storage these days. But these improvements don’t primarily come from the ever-decreasing cost of storage. They primarily come from Microsoft getting better and better at running the Office 365 service, bringing their internal COGS (cost of goods sold) down, and passing those savings along to customers.

This is great news for both current users of the Office 365 service and future customers alike.

-CJ

SharePoint Saturday Redmond coming on Sept 21st

One of the great joys of the SharePoint community is the momentum of the SharePoint Saturday community led events. These are run by the community for the community and are one day events … on a (you guessed it) Saturday. People take time out of their personal schedules to attend and hear from speakers on all sorts of topics.

SharePoint Saturday Redmond is coming up again this year on the 21st of September.

This year Provoke Solutions are proud to be a Gold sponsor of the event and will have a booth on site. Come and talk with us about your SharePoint needs!

This year I have the pleasure of speaking at the event also:

Designing and Building solutions with the future in mind
Want to know how to ensure you don’t develop your way into a dead end? The cat is out of the bag on the new Apps development model in SharePoint 2013. Come and learn about architecting and designing your SharePoint development projects today in order to be set up for taking advantage of this new model in the future. We will cover all the basics on the new model to bring you up to speed, plus how they translate to decisions you need to make today in your projects to ensure you don’t dig a hole that’s hard to get out of in the future.

To learn more about the event and to register (it’s FREE!) head over to: http://spsevents.org/city/Redmond/SPSRed2013

-CJ

SharePoint Provider Hosted Apps in Azure monitoring tip

One of the tips I gave during my session at TechEd North America this year was about using SignalR in your SharePoint provider hosted applications in Azure.

One of the pain points for developers and people creating provider hosted apps is monitoring them when they are running in the cloud. This might be just to see what is happening in them, or it might be to assist with debugging an issue or bug.

SignalR has helped me A LOT with this. It’s a super simple to use, real-time messaging framework. In a nutshell it’s a set of libraries that let you send and receive messages in code, be that in JavaScript or .Net code.

So how do I use it in SharePoint provider hosted apps in Azure to help me monitor and debug?

A SharePoint Provider Hosted App is essentially a web site that provides the parts of your app that surface in SharePoint through App Parts or App Pages etc… It’s a set of pages that can have code behind them, as any regular site does. It’s THAT code that I typically want to monitor while it’s running in Azure (or anywhere for that matter).

So how does this work with SignalR?

SignalR has the concept of Hubs that clients “subscribe” to and producers of messages “publish” to. In the diagram below, app page code publishes messages (such as “there was a problem doing XYZ”) and consumers listening to the Hub receive those messages when they are published.

[Image: diagram of app pages publishing messages to a SignalR Hub, with subscribed consumers receiving them]

In the example I gave at TechEd I showed a SharePoint Provider Hosted App deployed in Azure that published a message whenever anyone hit a page in my app. I also created a “Monitor.aspx” page that listened to the Hub for those messages from JavaScript and simply wrote them to the page in real-time.

How do you get this working? It’s pretty easy.

Part 1: Setting up a Hub and publishing messages

First add SignalR to your SharePoint Provider Hosted app project from NuGet. The image below shows the libraries to add.

[Image: the SignalR NuGet packages to add to the project]

Then in your Global.asax.cs you need to add an Application_Start method like this. It registers SignalR and maps the hub URLs correctly.

protected void Application_Start(object sender, EventArgs e)
{
    // Register the default hubs route: ~/signalr
    RouteTable.Routes.MapHubs();
}

Note: You might not have a Global.asax file, in which case you will need to add one to your project.

Then you need to create a Hub to publish messages to and receive them from. You do this with a new class that inherits from Hub, like this:

public class DebugMonitor : Hub
{
    public void Send(string message)
    {
        Clients.All.addMessage(message);
    }
}

This provides a single method called Send that any code in your SharePoint Provider Hosted app can call when it wants to send a message. I wrapped this code up in a short helper class called TraceCaster like this:

public class TraceCaster
{
    private static IHubContext context = GlobalHost.ConnectionManager.GetHubContext<DebugMonitor>();

    public static void Cast(string message)
    {
        context.Clients.All.addMessage(message);
    }
}

This gets a reference to the Hub called “context” and then uses that in the Cast method to publish the message. In code I can then send a message by calling:

TraceCaster.Cast("Hello World!");

That is all there is to publishing/sending a simple message to your Hub. 
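
For instance, in the TechEd demo I mentioned above the messages were published from page code-behind. A minimal sketch of that looks like the following (the page and namespace names here are just examples, not something the template gives you):

using System;

namespace MyProviderHostedAppWeb
{
    public partial class Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Publish a message to the Hub so anyone watching Monitor.aspx
            // sees this page hit in real time
            TraceCaster.Cast("Default.aspx hit at " + DateTime.UtcNow.ToString("o"));
        }
    }
}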

Now the fun part … receiving them 🙂

Part 2: Listening for messages

In my app I created a new page called Monitor.aspx. It has no code behind, just client side JavaScript. In that code it first references some JS script files: JQuery, SignalR and then the generic Hubs endpoint that SignalR listens on.

<script src="/Scripts/jquery-1.7.1.min.js" type="text/javascript"></script>
<script src="/Scripts/jquery.signalR-1.1.1.min.js" type="text/javascript"></script>
<script src="/signalr/hubs" type="text/javascript"></script>

When the page loads you want some JavaScript that starts listening to the Hub and registers a function, “addMessage”, that is called whenever a message is sent from the server.

$(function () {
    // Proxy created on the fly
    var chat = $.connection.debugMonitor;

    // Declare a function on the hub so the server can invoke it
    chat.client.addMessage = function (message) {

        var now = new Date();
        // format() comes from a date formatting helper (e.g. dateFormat.js);
        // it is not built into JavaScript
        var dtstr = now.format("isoDateTime");

        $('#messages').append('[' + dtstr + '] - ' + message + '<br/>');
    };

    // Start the connection
    $.connection.hub.start().done(function () {
        $("#broadcast").click(function () {
            // Call the send method on the hub
            chat.server.send($('#msg').val());
        });
    });
});

This code uses the connection.hub.start() function to start listening for messages from the Hub. When a message is sent, the addMessage function is fired and we can do whatever we like with it. In this case it simply appends the message to an element on the page.

All going well, when your app is running you will be able to open up Monitor.aspx and see messages flowing like this:

[Image: Monitor.aspx showing timestamped messages arriving in real time]

If you don’t see messages flowing you probably have a setup problem with SignalR.  The most common thing I found when setting this up was the client not being able to correctly reference the SignalR JS or Hub on the server.  Use the developer tools in IE or Chrome (or Fiddler) to check that the calls being made to the server are working correctly (see below for what working should look like):

[Image: browser developer tools showing the SignalR script and hub requests succeeding]

“What if I am not listening for messages? What happens to them?” I hear you say! Well, unless someone is listening, the messages simply go away. They are not stored. This is a real-time monitoring solution; think of it as a window into what’s going on in your SharePoint Provider Hosted app.

There are client libraries for .Net, JS, iOS and Android too, so you can publish and listen for messages on all sorts of platforms. Another thing I have used this for is simple real-time communication between Web Roles and Web Sites in Azure. SignalR can use the Azure Service Bus to assist with this and it’s pretty simple to set up; a rough sketch follows.
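
Here is roughly what that Service Bus wiring looks like. This is a minimal sketch: it assumes the Microsoft.AspNet.SignalR.ServiceBus package, the exact extension method has shifted a little between SignalR releases (so check the docs for your version), and the connection string and topic prefix are placeholders you would swap for your own.

using System;
using System.Web.Routing;
using Microsoft.AspNet.SignalR;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Placeholder connection string for your Service Bus namespace
        string connectionString = "Endpoint=sb://yournamespace.servicebus.windows.net/;...";

        // Relay SignalR messages over Service Bus topics so every
        // Web Role / Web Site instance sees the same messages
        GlobalHost.DependencyResolver.UseServiceBus(connectionString, "MyAppSignalR");

        // Register the default hubs route: ~/signalr (same as before)
        RouteTable.Routes.MapHubs();
    }
}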

Summary

I’m a developer from way back, when debugging meant printf. Call me ancient, but I like being able to see what is going on in my code in real time. It just gives me a level of confidence that things are working the way they should.

SignalR coupled with SharePoint Provider Hosted Apps in Azure is a great combination. It doesn’t provide a solution for long term application logging, but it does provide a great little real-time window into your app that I personally love.

If you want to learn more about SignalR then I suggest you take a look at http://www.asp.net/signalr where you will find documentation and videos on other uses for SignalR.  It’s very cool.

Do I use it in production? You bet! I use it in the backend of my Windows Phone and Windows 8 application called My Trips, as well as in SharePoint Provider Hosted Apps in Azure. Here is a screen shot from the My Trips monitoring page, where I can watch activity for various devices registering with my service etc…

[Image: the My Trips monitoring page showing device registration activity]

Happy Apping…

-CJ

SkyDrive Pro naming quandary and the mess of MS file sync products

Today Mary Jo Foley wrote a piece about Microsoft losing the fight with British Sky Broadcasting Group over the SkyDrive name, and in particular (as I understand it) the “Sky” part of it.

Microsoft to rebrand SkyDrive after losing trademark skirmish – Mary Jo Foley – ZD Net

In that article she also mentions that this will likely affect the SkyDrive Pro product naming too, which perked my ears up given its SharePoint linkage.

I never liked the SkyDrive Pro name. Why? Primarily because it makes it sound like it has something to do with SkyDrive and cloud storage, when in reality you can use SkyDrive Pro to sync files from your on-prem SharePoint system … which doesn’t feel particularly “Sky” or “Cloudy” to me! It just confuses everyone I speak with.

I can see what Microsoft were trying to achieve with regards to Office 365 and syncing files … it makes a little more sense in that regard. However, I always had a problem with it “hooking” on to the consumer brand of SkyDrive. They are 100% totally separate products with no technical similarities etc… The only similarity is that they sync files to your PC from somewhere.

I actually think having to rename SkyDrive is a blessing in disguise for Microsoft.

Now is the perfect time to get it right and clear up all the confusion!

To make naming and technology matters worse, Windows Server 2012 R2 is introducing a feature called “Work Folders” which enables remote access to on-prem file shares. The reasoning is that thousands of customers have files in shares and will continue to, so it would be good to have access to those remotely. You can read about Work Folders here: Introducing Work Folders on Windows Server 2012 R2. It’s actually a pretty nifty solution.

SO!  Now we currently have:

  • SkyDrive Pro – which allows you to sync files from Office 365 and from SharePoint on-prem.
  • Work Folders – which allows you to access files from file shares on-prem.

Do you see where I am going with this?

Wouldn’t it be sensible for Microsoft to release ONE Enterprise grade file sync/access tool that let you access files from Office 365, SharePoint on-prem and file shares on-prem?

I call it “Files”. Users would never see that name, however, as it would show up branded/renamed in Explorer, iOS, Windows 8 and Windows Phone using the organization’s logo and name, e.g. “Contoso Files”.

Currently here is what I see in Explorer:

[Image: Explorer as it looks today]

Here is what I want to see in Explorer:

[Image: Explorer with a single consolidated “Contoso Files” location]

In that one location you would see a list of “folders” that either belong in SharePoint on-prem, Office 365 or Work Folders (file shares on prem).

[Image: one location listing folders from SharePoint on-prem, Office 365 and Work Folders]

Is this really too much to ask for?

I don’t think so 🙂

With branding up in the air it would be a good time to strike and fix this mess up. However, I suspect it will take a while for that to happen; we are likely to see a name change first in the coming weeks/months (after all, it’s just a Select All + Replace, right?! :)) and a consolidated product down the line (all fingers crossed).

What do you think should happen? Or what would you like to see from Microsoft to make your file sync and management simpler?

-CJ.

SPChat transcript: App Model with Chris Johnson

On Wednesday last week I had the pleasure of participating in my first SPChat over on the SharePoint-Community.net site.  These are online Q&A based chats where an “expert” is invited and questions are fielded in real time from the live audience.  Each chat has a topic, but anything goes within that topic.

My topic was the new SharePoint application model for developers in SharePoint 2013 and Office 365. It was really fun fielding questions, but I also felt like my typing was slow, and it was hard to keep my answers short knowing that I wanted to get as many questions answered as possible.

Questions ranged from the app store submission process to app domain isolation, to tools I use in app development and the financial side of apps. All interesting topics.

You can read a full transcript online here:

http://sharepoint-community.net/profiles/blogs/spchat-transcript-app-model-with-chris-johnson

I hope to do another one some day!

-CJ

The importance of APIs

This week was interesting for owners of Nissan Leaf all-electric vehicles. Nissan has apps for iPhone, Android and Blackberry that let owners see the battery level of their car, turn charging on and off, and set the climate control inside their car. For Windows Phone users (and users of any other non-officially supported platform, like Windows 8), 3rd party apps like LEAF Commander were important to us.

Then a week or so ago Nissan said the following about the ability to use these apps:

“for the privacy and security of our owners, Nissan will be removing that capability” [op: ability to use 3rd party apps]

This of course went down like a cup of cold puke with owners. Especially so in the Seattle, Bellevue and Redmond area where Microsoft is king and Windows Phone reigns supreme. LEAF Commander is being used by > 2000 active Leaf owners worldwide. So the network of Leaf owners erupted, and I am sure the poor people at the Nissan customer support desk got an earful from more than a few disgruntled people.

It must have got their attention, as today I got an email back from Nissan saying:

Based on initial customer feedback, we understand how important this connectivity is to LEAF drivers, and we will delay taking this action while we further study other potential solutions and explore ways to keep customer data secure.

It’s a shame it’s just a delay … but hopefully this results in a better-thought-out decision.

So what does this have to do with APIs? 

As I understand it there isn’t a well documented and thought out API for talking to the Nissan system. These apps have users type their usernames and passwords into the apps themselves, and I am guessing that is what Nissan were unhappy about. If you give your username and password to an app, it could save them and use them for whatever it liked, and there would be no way for the Nissan service to tell whether it was a real user or an app calling. It would be just like handing someone your username and password.

This is the reason why many services on the internet (Twitter, Facebook, Flickr, TripIt and Office 365 included) use OAuth to broker trusting an app to make calls to a service on a user’s behalf. The user never gives the app their username and password; they just authorize the app, telling the service that it’s OK for that app to do certain things on their behalf. When they are no longer OK with that, they simply revoke that app’s access. In Facebook that screen looks like this:

[Image: Facebook’s app permissions screen where a user can revoke an app’s access]
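
To make that flow concrete, here is a minimal sketch of the standard OAuth 2.0 authorization code exchange. Every endpoint, client id and scope name here is hypothetical (Nissan has no such public API today, which is exactly my point); it just shows the shape of what I wish they offered:

using System;
using System.Collections.Generic;
using System.Net.Http;

class OAuthFlowSketch
{
    static void Main()
    {
        // Step 1: send the user to the service's authorize page in a browser.
        // They log in THERE (never inside the 3rd party app) and approve what
        // the app is allowed to do.
        string authorizeUrl = "https://api.example.com/oauth/authorize"
            + "?response_type=code&client_id=my-app-id"
            + "&redirect_uri=https://myapp.example.com/callback"
            + "&scope=read_battery";
        Console.WriteLine("Send the user to: " + authorizeUrl);

        // Step 2: the service redirects back to the app with a short-lived
        // code, which the app exchanges for an access token. The user's
        // password never touches the app, and the grant can be revoked at
        // any time from a screen like the Facebook one above.
        string code = "code-from-the-redirect";
        using (var http = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "grant_type", "authorization_code" },
                { "code", code },
                { "client_id", "my-app-id" },
                { "client_secret", "my-app-secret" },
                { "redirect_uri", "https://myapp.example.com/callback" }
            });
            var response = http.PostAsync("https://api.example.com/oauth/token", body).Result;
            Console.WriteLine(response.Content.ReadAsStringAsync().Result);
        }
    }
}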

I would dearly love to see Nissan open up an API that any app developer could use and that was authorized with something like OAuth. This would let other developers build on their APIs, offer their customers great app experiences, and most importantly build customer satisfaction. I use a lot of 3rd party apps for services like Twitter because I think they are better and offer me more than the official ones. The same goes for the Nissan apps (CarWings is the app’s name if you want to check it out). They are OK … but not GREAT! I am sure other developers could do better & add other things alongside the basic stuff that Nissan have no interest in.

Take my experience with TripIt for example. I make an app called My Trips for Windows Phone and Windows 8 that gives people a better TripIt experience than the official app on Windows Phone, and an alternative to using the browser on Windows 8 (offline etc..). TripIt have embraced others building apps on their service. There are loads of apps out there that integrate with TripIt, sync trips back and forth, integrate between systems etc… It’s a pretty robust ecosystem. And they get a lot of credit for allowing it!

In summary, I think this short story helps illustrate the importance of having APIs. One company has gone about poorly supporting its customers (by offering no API) while others are embracing APIs and thriving (TripIt, Office 365 etc…). I would dearly like to see Nissan learn from this and offer a secure and robust API that we can build great experiences over. Hopefully they won’t just delay the decision; they will build out a real API and a proper method of accessing it!

In this world of connected devices and services I can imagine this only becoming more important and one where companies that get it right thrive and those that don’t fail.

OK OK, I secretly want to be able to control my car from my watch … come on MS, sell a Surface watch, and Nissan, ship a decent API!

[Image: smartwatch concept]

DIWUG article on SharePoint and Yammer

This month I was honored to be able to contribute to the Dutch Information Worker User Group (DIWUG) e-magazine.

The article is titled “SharePoint, Yammer and the social landscape” and covers the lay of the social landscape as it currently stands, and what organizations can do now to help with the rift we currently see between SharePoint and Yammer social features.

The magazine is free and available to download here:
http://www.diwug.nl/e-magazines/Pages/default.aspx


Thanks,

-CJ

SharePoint app tools in Visual Studio 2013 preview, the new SharePointContext helper!

Today at the Build conference the new preview of Visual Studio 2013 was released. Along with it came some nice advances in the tools for building SharePoint and Office applications.

You can read the full post about the tools here.

The most interesting of these is the new out of the box project template option for creating an ASP.Net MVC based application. The project creation wizard asks you about this for provider hosted and autohosted apps.

[Image: the project creation wizard with the ASP.Net MVC option]

When you create a totally out of the box app using the wizard and pick this setting, you get a very familiar looking app if you have ever created an ASP.Net MVC app before.

[Image: the resulting ASP.Net MVC project structure]

Being able to create these via the wizard is pretty handy, but what is even cooler is a particular improvement to some of the additional helper classes you get in the new template.

Namely the SharePointContext class.

This is a new class added in the out of the box template that helps you manage your SharePoint context across page requests.

[Image: the SharePointContext class in the new project]

When SharePoint “starts” an app, i.e. when a user launches it, SharePoint packs up some information about that user and passes it along to the app as a POST parameter. This is called the context token, and it contains the OAuth tokens/information that you need in order to make calls back to SharePoint. The trick is that SharePoint only passes it to your app at launch, and then it is up to you to do something like cache it so that on subsequent page requests your app still has that context and can reuse it. The basic auth token in it is good for 12 hours, and it also contains a refresh token that can be used to get new auth tokens for up to 6 months.

The first question people who are new to SharePoint app development ask me is “my app breaks on the second page request, why?” … it is usually because they don’t realize it’s totally up to them to cache these very important tokens.

The new MVC template in the VS 2013 preview includes the SharePointContext class that helps with this whole process & gives you an out of the box experience that doesn’t totally suck.

In a new project you will see a line of code that looks like this:

var spContext = SharePointContextProvider.Current.GetSharePointContext(HttpContext);

This is creating a new context object & initializing it using all that good information passed as a POST parameter and some querystring parameters too.

Then you can use that context to get Client Side Object Model (CSOM) instances really easily like this:

using (var clientContext = spContext.CreateUserClientContextForSPHost())
{
    if (clientContext != null)
    {
        spUser = clientContext.Web.CurrentUser;

        clientContext.Load(spUser, user => user.Title);

        clientContext.ExecuteQuery();

        ViewBag.Message = spUser.Title;
    }
}

Now this is not all that different from what you used to do on the first request to your SharePoint app.  TokenHelper helped you create a CSOM instance from the request details. However the SharePointContext goes further.

When you make the GetSharePointContext(HttpContext) call the class checks the ASP.Net Session state for an existing context. If it doesn’t find one then it creates a new one based on the information passed and then stashes it in Session state for subsequent requests.

Then on subsequent requests, when SharePoint hasn’t passed any new tokens, GetSharePointContext(HttpContext) will return you the context from Session state. It will also deal with tokens expiring.
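
Conceptually, that lookup boils down to something like the following. This is a simplified sketch only; the template’s real SharePointContext.cs also validates the incoming request, distinguishes the ACS and high-trust cases, and handles token expiry.

public SharePointContext GetSharePointContext(HttpContextBase httpContext)
{
    const string SessionKey = "SPContext";

    // Reuse the cached context for this user if we already have one...
    var spContext = httpContext.Session[SessionKey] as SharePointContext;

    // ...otherwise build a new one from the values SharePoint POSTed when it
    // launched the app, and stash it for the requests that follow
    if (spContext == null)
    {
        spContext = CreateSharePointContext(httpContext.Request);
        httpContext.Session[SessionKey] = spContext;
    }

    return spContext;
}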

(Update 6/28 – added this paragraph)
Additionally, the provided [SharePointContextFilter] attribute on controller actions ensures a context was passed from SharePoint, and if not it will redirect the user back to SharePoint to obtain one. Why is this important? Well, if someone bookmarks your app, then when they use that bookmark the context won’t be passed, and in that case you need to bounce them via SharePoint to go and get one on that first request. The [SharePointContextFilter] automates that for you (this is only available in the MVC project, however). Very handy indeed not having to wire up this flow yourself!
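
Using it is just a matter of decorating an action, along these lines (sketched to match the earlier CSOM snippet; the exact out of the box controller code may differ slightly):

[SharePointContextFilter]
public ActionResult Index()
{
    // By the time we get here the filter has guaranteed a SharePoint context
    // exists (or has already bounced the user via SharePoint to get one)
    var spContext = SharePointContextProvider.Current.GetSharePointContext(HttpContext);

    using (var clientContext = spContext.CreateUserClientContextForSPHost())
    {
        if (clientContext != null)
        {
            var spUser = clientContext.Web.CurrentUser;
            clientContext.Load(spUser, user => user.Title);
            clientContext.ExecuteQuery();
            ViewBag.Message = spUser.Title;
        }
    }

    return View();
}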

You don’t need to worry about writing any of this yourself, however, as it is all done for you in the helper class. Go take a look at the GetSharePointContext(HttpContextBase httpContext) method if you are interested in seeing how it works, and more importantly run your app and step through the code so you can see how it runs.

Once you have a SharePointContext object you can take a look in it and find the CacheKey (used for uniquely identifying this in the cache), ContextToken (for making calls back to SP), RefreshToken (for getting more access tokens when they expire) and other properties it stashes for you like the site language, SP product version etc…

[Image: a SharePointContext instance in the debugger showing CacheKey, ContextToken, RefreshToken and other properties]

The way in which it is structured is also very friendly to replacement if you want to roll a different caching technique. You could replace the existing implementation or create your own SharePointContextProvider class that manages the caching, as sketched below.
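
For example, at application start you could plug in a provider of your own. SharePointContextProvider.Register comes from the template as I recall; MyDistributedCacheContextProvider is hypothetical, standing in for a subclass you would write that caches contexts somewhere other than Session state, say a distributed cache, so contexts survive across scaled-out instances:

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    RouteConfig.RegisterRoutes(RouteTable.Routes);

    // MyDistributedCacheContextProvider is a hypothetical subclass that
    // overrides how contexts are loaded/saved, e.g. against a distributed
    // cache instead of per-server Session state
    SharePointContextProvider.Register(new MyDistributedCacheContextProvider());
}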

The library comes with implementations for both on-prem and cloud scenarios:

  • SharePointAcsContextProvider – Office 365
  • SharePointHighTrustContextProvider – On-Prem apps using the high trust S2S auth model

Summary

This is a simple but very timely addition to the out of the box templates in VS!

Just a few weeks ago at TechEd North America I did a tips and tricks session for app developers, and Demo #2 showed a simpler version of essentially the same thing. The main difference in the helper class I showed in that demo was that it works for ASP.Net Forms apps as well as MVC (Update 6/27: the newly shipped helper does support ASP.Net Forms based apps too) … however it doesn’t deal with the high-trust S2S scenario for on-prem only apps.

Shout out to @chakkaradeep for the great work on the VS SharePoint tools (a topic near and dear to me) & for taking the time to watch my TechEd session and let me know they were releasing this helper today!

Update 6/27/2013: Mike Morton did a great session at Build yesterday that walks through a whole lot of this. Watch it here: http://channel9.msdn.com/Events/Build/2013/3-319

To try this out for yourself you will need to get the VS 2013 preview bits: http://www.microsoft.com/visualstudio/eng/2013-downloads

You will also need a SharePoint site to try it out in and I recommend signing up for a trial Office 365 site here: http://msdn.microsoft.com/en-US/library/office/apps/fp179924

Thanks!

-CJ