Active Directory, Azure Active Directory and Office 365

Apologies for the absence; I am juggling a Master's degree as well as my SharePoint career and being a good mummy! The pressure is on! And I've got another Nuts Challenge on the 1st of March. I now realise why I was so happy to be done with my first degree… the requirement for perfect references has come a long way since 2006!

I have a new client who has a requirement to move their current Java-based intranet to SharePoint Online, specifically Office 365. They have on-premises Active Directory already in place and do not want to change this, as they have other applications which are dependent on it. They also have Azure Active Directory (AAD) in place for other applications, so my solution will need to include these elements. My first question was why they needed AAD, as Office 365 can work perfectly without it. The answer is that they have several other cloud applications that require authentication through AAD, so it must be used. I needed to do some research into using an on-premises Active Directory with AAD and Office 365, to see if they could mesh well together. Someone told me that the primary AD controller would have to be moved to Azure.

My research suggests that on-premises AD works very well with AAD (Azure AD) managing credentials for O365. My question to them would still be why they needed AAD in the first place, as an on-premises AD alone will work perfectly well with O365. My guess would be that they have several different types of credentials per user, e.g. a Windows Live account as well as an organisational account. I have both of these for uni: the Windows Live account allows me to log on to OWA and access library services, while the organisational account is for when I'm logged in to the online portal or the Desktop Anywhere application (replicating conditions as if I were on campus).

My understanding of it is as follows:

What Microsoft have done is take Active Directory, a widely deployed, enterprise-grade identity management solution, and make it operate in the cloud as a multitenant service with Internet scale, high availability, and integrated disaster recovery. You are able to keep your AD on-premises for all domains. Then you can use Windows Azure Active Directory to give users single sign-on to Software as a Service (SaaS) applications, O365 included. Applications running in the cloud or on-premises can use Windows Azure Active Directory Access Control to let users log in using identities from Facebook, Google, Microsoft, and other identity providers. To support this, Microsoft makes it easy to “connect” Windows Azure Active Directory with an existing directory. At the technical level, organizations can enable identity federation and directory synchronization between an existing Active Directory deployment and Windows Azure Active Directory.
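For a flavour of the “connect” part, here is a minimal sketch, assuming the Windows Azure AD Module for PowerShell is installed; treat it as a sketch rather than the full procedure:

# Enable directory synchronisation for the tenant (sketch only)
Connect-MsolService                                  # prompts for tenant admin credentials
Set-MsolDirSyncEnabled -EnableDirSync $true
Get-MsolCompanyInformation | Select-Object DirectorySynchronizationEnabled   # verify it took effect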

When an organization does this, its Active Directory is, in a sense, stretching over both an on-premises and a cloud deployment. The ability for Active Directory to operate across both on-premises and cloud deployments in a hybrid mode enables an organization to easily take advantage of new cloud-based platforms and SaaS applications, while all of its existing identity management processes and application integration can continue unaffected.

In addition, being able to operate in this hybrid mode is critical for some organizations because of business or regulatory requirements that mandate that certain critical information, such as passwords, be maintained in on-premises servers.

If you don’t have your own on-premises Active Directory and you sign up for Office 365, Microsoft automatically create a new Windows Azure Active Directory that is associated with the Office 365 account. No action is required on the part of the individual signing up. This is the simplest approach; involving an on-premises AD makes for a more hybrid deployment.
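In that simple cloud-only setup, user accounts live entirely in Windows Azure AD. A hedged sketch of creating one (the names here are hypothetical):

# Create a cloud-only user directly in Windows Azure AD
Connect-MsolService
New-MsolUser -UserPrincipalName "jane@contoso.onmicrosoft.com" -DisplayName "Jane Doe" -FirstName "Jane" -LastName "Doe"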

An organization can federate Windows Server Active Directory with Windows Azure Active Directory to give its users single sign-on to SaaS applications. In this scenario, a user at organization B wishes to access a SaaS application. Before she does this, the organization’s directory administrators must establish a federation relationship with Windows Azure AD using AD FS. Those admins must also configure data synchronization between the organization’s on-premises Windows Server AD and Windows Azure AD, which automatically copies user and group information from the on-premises directory to Windows Azure AD. Notice what this allows: in effect, the organization is extending its on-premises directory into the cloud. Combining Windows Server AD and Windows Azure AD in this way gives the organization a directory service that can be managed as a single entity, while still having a footprint both on-premises and in the cloud.

To use Windows Azure AD, the user first logs in to her on-premises Active Directory domain as usual (step 1). When she tries to access the SaaS application (step 2), the federation process results in Windows Azure AD issuing her a token for this application (step 3). As before, this token contains information that identifies the user, and it’s digitally signed by Windows Azure AD. This token is then sent to the SaaS application (step 4), which validates the token’s signature and uses its contents (step 5). And as in the previous scenario, the SaaS application can use the Graph API to learn more about this user if necessary (step 6).
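If you want to check which of your domains are federated (sign-in handed off to AD FS) and which are managed (cloud credentials), a quick sketch using the same PowerShell module:

# List each domain and whether it is Managed or Federated
Connect-MsolService
Get-MsolDomain | Select-Object Name, Authentication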

Today, Windows Azure AD isn’t a complete replacement for on-premises Windows Server AD. As already mentioned, the cloud directory has a much simpler schema, and it’s also missing things such as group policy, the ability to store information about machines, and support for LDAP. (In fact, a Windows machine can’t be configured to let users log in to it using nothing but Windows Azure AD – this isn’t a supported scenario.) Instead, the initial goals of Windows Azure AD include letting enterprise users access applications in the cloud without maintaining a separate login and freeing on-premises directory administrators from manually synchronizing their on-premises directory with every SaaS application their organization uses. Over time, however, expect this cloud directory service to address a wider range of scenarios.

Since 2011 Microsoft have enhanced Windows Azure Active Directory by adding new, Internet-focused connectivity, mobility, and collaboration capabilities that offer value to applications running anywhere and on any platform. This includes applications running on mobile devices like iPhone, cloud platforms like Amazon Web Services, and technologies like Java. So it isn’t limited to Microsoft technologies; it can be multi-platform.

Azure Active Directory Access Control

Windows Azure Active Directory (WAAD/AAD) can give an organization’s users single sign-on to multiple SaaS applications, for example. But identity technologies in the cloud can also be used in other ways.

Suppose, for instance, that an application wishes to let its users log in using tokens issued by multiple identity providers (IdPs). Lots of different identity providers exist today, including Facebook, Google, Microsoft, and others, and applications frequently let users sign in using one of these identities. Why should an application bother to maintain its own list of users and passwords when it can instead rely on identities that already exist? Accepting existing identities makes life simpler both for users, who have one less username and password to remember, and for the people who create the application, who no longer need to maintain their own lists of usernames and passwords.

But while every identity provider issues some kind of token, those tokens aren’t standard – each IdP has its own format. Furthermore, the information in those tokens also isn’t standard. An application that wishes to accept tokens issued by, say, Facebook, Google, and Microsoft is faced with the challenge of writing unique code to handle each of these different formats.

But why do this? Why not instead create an intermediary that can generate a single token format with a common representation of identity information? This approach would make life simpler for the developers who create applications, since they now need to handle only one kind of token. Windows Azure Active Directory Access Control does exactly this, providing an intermediary in the cloud for working with diverse tokens.

Windows Azure Active Directory Access Control makes it easier for applications to accept identity tokens issued by different identity providers. The process begins when a user attempts to access the application from a browser. The application redirects her to an IdP of her choice (one that the application also trusts). She authenticates herself to this IdP, such as by entering a username and password (step 1), and the IdP returns a token containing information about her (step 2). Access Control supports a range of different cloud-based IdPs, including accounts created by Google, Yahoo, Facebook, Microsoft (formerly known as Windows Live ID), and any OpenID provider. It also supports identities created using Windows Azure Active Directory and, through federation with AD FS, Windows Server Active Directory. The goal is to cover the most commonly used identities today, whether they’re issued by IdPs in the cloud or on-premises.

Once the user’s browser has an IdP token from her chosen IdP, it sends this token to Access Control (step 3). Access Control validates the token, making sure that it really was issued by this IdP, then creates a new token according to the rules that have been defined for this application. Like Windows Azure Active Directory, Access Control is a multi-tenant service, but the tenants are applications rather than customer organizations. Each application can get its own namespace and can define various rules about authorization and more.

These rules let each application’s administrator define how tokens from various IdPs should be transformed into an Access Control token. For example, if different IdPs use different types for representing usernames, Access Control rules can transform all of these into a common username type. Access Control then sends this new token back to the browser (step 4), which submits it to the application (step 5). Once it has the Access Control token, the application verifies that this token really was issued by Access Control, then uses the information it contains (step 6).

While this process might seem a little complicated, it actually makes life significantly simpler for the creator of the application. Rather than handle diverse tokens containing different information, the application can accept identities issued by multiple identity providers while still receiving only a single token with familiar information. Also, rather than require each application to be configured to trust various IdPs, these trust relationships are instead maintained by Access Control – an application need only trust it.

It’s worth pointing out that nothing about Access Control is tied to Windows – it could just as well be used by a Linux application that accepted only Google and Facebook identities. And even though Access Control is part of the Windows Azure Active Directory family, you can think of it as an entirely distinct service from what was described in the previous section. While both technologies work with identity, they address quite different problems (although Microsoft has said that it expects to integrate the two at some point). Working with identity is important in nearly every application. The goal of Access Control is to make it easier for developers to create applications that accept identities from diverse identity providers. By putting this service in the cloud, Microsoft has made it available to any application running on any platform.

So there is no requirement for the primary AD controller to be moved to Azure; you just ‘connect’ and sync up.

References

http://en.share-gate.com/blog/migrate-to-office-365-configure-sharepoint-to-use-active-directory

http://www.windowsazure.com/en-us/documentation/articles/fundamentals-identity/

http://blogs.msdn.com/b/windowsazure/archive/2012/07/12/announcing-the-developer-preview-of-windows-azure-active-directory.aspx

http://blogs.msdn.com/b/windowsazure/archive/2012/05/23/reimagining-active-directory-for-the-social-enterprise-part-1.aspx

http://redmondmag.com/articles/2013/07/01/cloud-identity.aspx

http://betanews.com/2013/07/12/active-directory-as-a-service-azure-intune-hinting-at-a-cloud-hosted-ad-future/

Setting it up involves running a few PowerShell commands to set up a trust between AD FS and AAD. SSO needs to be implemented, and the domain must be a top-level domain. Full steps here: http://technet.microsoft.com/en-us/library/jj205461
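For a flavour of those commands, a minimal sketch, assuming the Windows Azure AD Module for PowerShell is available on (or can reach) the AD FS server; the server and domain names below are hypothetical:

Connect-MsolService                                     # sign in as a tenant Global Administrator
Set-MsolADFSContext -Computer adfs01.contoso.local      # point at the AD FS server (hypothetical name)
Convert-MsolDomainToFederated -DomainName contoso.com   # convert the top-level domain to federated
Get-MsolFederationProperty -DomainName contoso.com      # verify the trust settings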

This is the only recommended implementation approach I have found for this hybrid deployment. If anyone can highlight other approaches, I would be most interested to know.

Over and Out


Changing the Distributed Cache Service from One Server to another

Not gonna make this a long one because things are really busy. If you are getting to this post, you should have an idea what the Distributed Cache service is all about; if not, here is a summary…

The Distributed Cache service is either required by or improves performance of the following features:

  • Authentication
  • Newsfeeds
  • OneNote client access
  • Security Trimming
  • Page load performance

For an overview of the Distributed Cache service, see http://technet.microsoft.com/library/jj219700(office.15).aspx or, in plain English:

http://blog.idera.com/sharepoint/the-five-minute-cheat-sheet-on-sharepoint-2013s-distributed-cache-service/#comment-2533

I have some SharePoint 2013 environments I use for demos and the like. One such environment has 4 servers: one Domain Controller, a VPN server for networking, another running SharePoint 2013 as a web front end, and the last one a SharePoint 2013 server dedicated specifically to managing the Distributed Cache service. I inherited this particular set of environments from Microsoft with dummy content on, so I don’t have to work too hard at developing features for demos and POCs. My laptop has a max of 16GB RAM, so running all 4 servers in my Hyper-V is very resource-consuming, and I needed to find a way of minimising the RAM requirements so I could use the machines all together without compromising on performance.

Although it’s not totally necessary to have the Distributed Cache on a dedicated server unless you have a large farm, I decided to remove the need for the dedicated Distributed Cache server, as the farm is for the use of about 100 users. That would save me about 8GB of RAM! So no need for a dedicated cache server. But I do need to preserve the data on there and somehow transfer it somewhere else where there is a valid Distributed Cache service available to consume the data. The other web front end, perhaps? Correcto! And then remove the service from the dedicated server and put the beast to rest.

So, in layman’s terms…

Web Front End with Distributed Cache Service (dedicated) = WFE-DC

Web Front End with The Rest Of the SharePoint Services available = WFE-ALL

Some constants –

WFE-ALL has AppFabric installed (necessary for the Distributed Cache), and the firewall is deactivated for the domain or allows inbound ICMPv4 traffic. If you need Windows Firewall, you can enable this in PowerShell with the Set-NetFirewallRule cmdlet. The name of the rule is “File and Printer Sharing (Echo Request – ICMPv4-In)”. Notice also that it doesn’t take a Boolean ($true), but rather the string “True” as an argument to the -Enabled parameter. Don’t forget to Import-Module NetSecurity first, though! See below…

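Something like this, a minimal sketch (the rule display name can vary slightly by OS version and locale, so check Get-NetFirewallRule first):

Import-Module NetSecurity
# Allow inbound ping so the cache hosts can see each other
Set-NetFirewallRule -DisplayName "File and Printer Sharing (Echo Request - ICMPv4-In)" -Enabled True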

By now you’ve figured we need to use our new best friend PowerShell to execute this master plan. Get used to PowerShell if you wanna be a good developer/architect; it can save you time and effort!

The Distributed Cache is one greedy kid, consuming a max of 10% of the total memory resource on any server it runs on. So be aware of this when allocating memory to the server hosting the service.

For whatever reason you need to move the Distributed Cache Service in a SharePoint environment,  here’s the master plan….

1.   Add the Distributed Cache Service to the WFE-ALL server to create a Distributed Cache Cluster (Fancy eh?). You need somewhere to move the cache data to when removing the service from the WFE-DC, to avoid data loss.

2.  ‘Gracefully’ shut down the Distributed Cache service on the WFE-DC. This will automatically move all the data over to the WFE-ALL server, which is now part of the cluster. It’s a long process; expect it to take around 15 mins or more depending on the amount of data in there.

3.  Then remove the service from the WFE-DC and it’s all done.

Now would you believe me if I told you this can be accomplished with 3 lines of PowerShell??? Check this out…

1.  On WFE-ALL run the following:

Add-SPDistributedCacheServiceInstance

2.   On WFE-DC run the following:

Stop-SPDistributedCacheServiceInstance -Graceful
Remove-SPDistributedCacheServiceInstance

Wait 15 mins, pour a drink or make a coffee, whichever poison tickles your fancy, and then check the service is operational in its new location. Also test microfeed update settings in newsfeeds; this will be a good indicator of its state.
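To check on the cluster from PowerShell, a hedged sketch using the AppFabric cache administration cmdlets that ship with SharePoint 2013:

Import-Module DistributedCacheAdministration
Use-CacheCluster      # connect to the cluster the local farm uses
Get-CacheHost         # WFE-ALL should report Service Status UP; WFE-DC should no longer be listed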

Other stuff you might need to manage at a later date includes changing the service account, changing the memory allocation, or repairing a cache host:

http://technet.microsoft.com/en-us/library/jj219613.aspx
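As an example, resizing the cache is a one-liner once the service is stopped on every cache host (the value below is illustrative only):

# Gracefully stop the Distributed Cache service on each cache host first, then:
Update-SPDistributedCacheSize -CacheSizeInMB 1024    # example value; applies to all cache hosts
# ...then restart the Distributed Cache service on each cache host (e.g. via Services on Server)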

Pretty simple stuff peeps?

Peace, Love and SharePoint!

SharePoint 2013 Search not provisioning correctly

Imagine, one fine sunny day, you built a new SharePoint 2013 Search application for a new server or an old one, and the rain came in the form of an error accessing the application, even after Central Admin stated clearly that it was created successfully. Here’s what you would be experiencing:

After creating the Search Service Application, the following issues arise:

  1. The search topology displays the following message: “Unable to retrieve topology component health states. This may be because the admin component is not up and running.”
  2. The default content source “Local SharePoint Sites” is inconsistent. It doesn’t always appear after Search is created; when it does, the start addresses of existing web apps are sometimes listed already, other times not.
  3. Starting a full crawl results in the crawl getting stuck at ‘Starting’.

So you can’t configure your search. In PowerShell, all search components are listed as available.
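That PowerShell check is roughly the following (a hedged sketch, assuming a single Search service application in the farm):

$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchStatus -SearchApplication $ssa   # lists each topology component and its reported state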

Event Viewer shows:

Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (cf15c8c7-1980-4391-bd97-17de75c4dd5d).

Reason: Failed to connect to system manager. SystemManagerLocations: http://sp2013/AdminComponent1/Management

Technical Support Details:

System.InvalidOperationException: Failed to connect to system manager. SystemManagerLocations: http://sp2013/AdminComponent1/Management ---> System.ServiceModel.EndpointNotFoundException: There was no endpoint listening at http://sp2013/AdminComponent1/Management/NodeController that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.

Server stack trace:

   at System.ServiceModel.Channels.ConnectionUpgradeHelper.DecodeFramingFault(ClientFramingDecoder decoder, IConnection connection, Uri via, String contentType, TimeoutHelper& timeoutHelper)

   at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.SendPreamble(IConnection connection, ArraySegment`1 preamble, TimeoutHelper& timeoutHelper)

   at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.DuplexConnectionPoolHelper.AcceptPooledConnection(IConnection connection, TimeoutHelper& timeoutHelper)

   at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)

   at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.OnOpen(TimeSpan timeout)

   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)

   at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)

   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)

   at System.ServiceModel.Channels.ServiceChannel.CallOnceManager.CallOnce(TimeSpan timeout, CallOnceManager cascade)

   at System.ServiceModel.Channels.ServiceChannel.EnsureOpened(TimeSpan timeout)

   at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)

   at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)

   at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

Exception rethrown at [0]:

   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)

   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)

   at Microsoft.Ceres.CoreServices.Services.Node.INodeControllerManagementAgent.get_NodeName()

   at Microsoft.Ceres.CoreServices.Tools.Management.Client.AbstractSystemClient.TryConnect(Uri nodeManagementUri, ICollection`1 locationsToTry, ICollection`1 triedLocations, ICollection`1 nodeSystemManagerLocations)

   — End of inner exception stack trace —

   at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()

   at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)

After much digging and playing around, it seems this error goes away if you change the app pool Search is running under to the SharePoint Web Services Default. But this is not what we want; Search should run under its own app pool (there’s a quick check for this after the links below). So after much more digging, I discovered that this is another bug I have found in SharePoint 2013, fixable by applying the following hotfixes:

http://support.microsoft.com/kb/2567680

http://support.microsoft.com/kb/2554876

http://support.microsoft.com/kb/2708075

http://support.microsoft.com/kb/2472264
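And the quick app pool check mentioned above, a hedged sketch (property names as I recall them, so verify on your build):

$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.ApplicationPool | Select-Object Name, ProcessAccountName   # which pool and account Search runs under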

Give the server a good old reboot and things should work fine. That’ll save you a few days of trouble.

Peace love and happiness!