Thursday, November 15, 2012

Resolved error “The following system error occurred: No mapping between account names and security IDs was done.”

After getting our new TFS 2012 server built out, I found myself unable to grant permissions to the Tfs_Analysis database in SQL Server Analysis Services. When I attempted to add a new user from the local Active Directory domain to the TfsWarehouseDataReader role, I received the error above.

I found that the role membership contained a user that was no longer resolvable in Active Directory and showed up as a SID only (no domain\name). Removing that unresolved user allowed me to add new groups from the local AD to the role.

It appears that the membership logic revalidates all of the users and groups in the role's membership list whenever members are added or removed, so a single orphaned SID can block the change.
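If you want to see which member is the problem before touching the role, a small AMO script can list every member and its SID. This is just a sketch, assuming the server, database, and role names from this setup and that the AMO client library (Microsoft.AnalysisServices) is referenced; the removal lines are commented out on purpose:

using System;
using Microsoft.AnalysisServices;   // AMO client library

class ListRoleMembers
{
    static void Main()
    {
        var server = new Server();
        server.Connect("Data Source=localhost");   // your SSAS instance

        var database = server.Databases.FindByName("Tfs_Analysis");
        var role = database.Roles.FindByName("TfsWarehouseDataReader");

        foreach (RoleMember member in role.Members)
        {
            // An orphaned member shows up with a SID but no resolvable domain\name.
            Console.WriteLine("{0}  (SID: {1})", member.Name, member.Sid);
        }

        // Once you've identified the orphaned entry:
        // role.Members.Remove(orphanedMember);
        // role.Update();

        server.Disconnect();
    }
}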

Sunday, September 2, 2012

Installing Visual Studio 2012 on Windows 8

I’ve run into a few quirks installing Visual Studio 2012 on Windows 8. The first is that the ISO image doesn’t appear to use the NTFS file system, and Windows 8 won’t mount the image because of that. Fortunately, my Windows 8 install is a VM, which let me mount the image on my host PC and run the installer from there.

Once I ran it that way, the installer started and ran partway through; however, it ended prematurely with an error that it couldn’t find a required object. Looking through the error logs didn’t give me enough detail to resolve it. When I clicked the provided link to look for common solutions, I found others had the same issue (with screenshots to match).

Right now, the only reliable way many people have found to install it is to use the Web installer. It works, but it’s slow. If I find an update that solves the ISO install problem, I’ll post it here.

Monday, August 13, 2012

Motorola S9-HD Headphone Continual Beeping - FIX

Recently my Motorola S9-HD Bluetooth headphones started doing something annoying. It began on a bike ride where I really wanted some music: periodically they would emit a long, continuous series of beeps. If I pressed the power button the beeps would stop, but not for long. Un-pairing and re-pairing them didn't change anything.

What I found did work was to completely drain the battery. Completely...meaning leave them on until they power off, then turn them back on until they shut off again. I then gave the battery a full charge. Afterward I had to re-pair them with my phone, though I'm not sure if that's normal. Once I did that, they started working properly again.

So far so good and the problem has not returned.

** EDIT **

Not long after I posted this, the problem returned. As best I can determine, the headphone batteries weren’t holding a good charge. I ended up putting them in my recycling pile. Now I’m on a quest to find a suitable replacement. I’ve looked at the next model up from Motorola, the S10s – I might go with them.

Sunday, July 22, 2012

The Road Back


Not long after my last post on predictive analytics, the small company I worked for let me know that they were experiencing serious financial difficulties - serious enough that they couldn't make payroll. Worse, they didn't know when they could.

It was a hard decision to come to, but the truth of it is I work to make a living. So it was out of a need to support my family that I started the dreaded job hunt. Thanks to a weak job market, that prospect didn't excite me too much.

Some friends of mine have worked for a very successful Microsoft partner here in Phoenix. Over the past couple of years they had encouraged me to come talk to them about the opportunities there. So I did and a few weeks later started a new chapter in my career.

The technologies are cutting edge, the projects are plentiful, and the people are amazing. I'm really looking forward to this new opportunity and am blessed to have a group of folks like this to work with.
I'm looking forward to getting back to active blogging and feel like things are really looking up.

/imapcgeek

Friday, April 20, 2012

Preparing for Predictive Analytics - Data is the Key

The Basics

As I mentioned in my last post, I’ll be making a series of posts on some of the challenges you will face when embarking on a predictive analytics project. In this post, I’m going to focus on something that may be obvious to most but has frequently proven to be a challenge for the customers we have worked with: having ready access to the required data.

Your organization’s data is the key to a successful predictive analytics project. Quality historical data is required to build models that let you make predictions about the likelihood of some event or behavior. Not all data is relevant in every modeling scenario, but generally the more information you have, the better; the modeling exercise will separate the signal from the noise. Some modeling techniques deal with noise better than others, so familiarity with the capabilities of your tool is very important here.

Cataloging Your Data

How many of you know all the different kinds of data used in your company? My experience has been that most organizations have a lot of data in silos that are not well documented and certainly not well integrated with other corporate data. This data can range from duplicate customer data, to sales data, to communication data such as email logs, marketing data, or whatever else is needed to GSD (Get Stuff Done). Master data management projects can help your organization centralize, de-duplicate, and cleanse that data, but the reality is that these projects are very complex and can take years to complete.

At a minimum, it is helpful to begin with a data cataloging project to get your arms around the data your organization has. Start with the basics: identify the types of data you have and who is responsible for maintaining it, and make note of where the data lives - that is, what tool or platform the system was developed in. Besides laying the groundwork for a master data management project down the road, this catalog will be extremely valuable in your predictive analytics projects because it tells you where to go to get the information you want to model.
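To make that concrete, here is a hypothetical example of what a minimal catalog entry might capture. The class and field names are mine, purely for illustration, not a formal standard:

// A deliberately simple data catalog entry: what is it, who owns it,
// where does it live, and what was it built with?
public class DataCatalogEntry
{
    public string SubjectArea { get; set; }   // e.g., "Customers", "Sales", "Email logs"
    public string Owner { get; set; }         // the person or team responsible for maintaining it
    public string SourceSystem { get; set; }  // the tool or platform the system was developed in
    public string Location { get; set; }      // server, database, or file share where the data lives
    public string Format { get; set; }        // relational, flat file, spreadsheet, XML, ...
}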

Does It Matter How the Data is Persisted?

The answer to this is highly dependent upon the tool or platform you are using to create your models and then subsequently to score the data. If you are using a commercial tool, your choices are limited by what that tool needs. Generally, the best answer is to bring the data together into a consistent storage medium. Whether that’s a relational database, a data warehouse, flat files, or XML files, the important thing is that the data can be accessed and interrogated as a set. Efficient set-based operations are critical to the performance of your analytics solution. Many of the modeling activities involve slicing, dicing, counting, aggregating, and transforming your data set in numerous ways, and this can be a very slow process if your data is not stored in a way that supports those kinds of activities.

Analytics Repositories

My recommendation is that, whenever possible, you should collect your data into a centralized analytics repository. With smaller data sets this is very approachable and is a common way to do it. With very large-scale enterprise data, however, it can be expensive, time consuming, and impractical; the time needed to update the repository can make timely model analysis impossible, especially if you are trying to model transactional data that is quickly being added or changed.

Wrapping Up

In my next post, I’ll be discussing two different approaches to designing analytics repositories that address these two scenarios. The first approach is to use the simpler relational repository to store the data set. The second approach is to use a virtualized metadata-driven repository, which can be extremely useful in the larger scale enterprise settings.

Tuesday, March 20, 2012

Preparing for Predictive Analytics

It’s been a while since I have written here because I’ve been heads-down on project work and have only just had a chance to come up for air. I want to begin with some short posts discussing a number of challenges that I’m currently working through. The major theme will be how companies can prepare for predictive analytics. If you aren’t familiar with the term, here’s a quick link to the Wikipedia topic to get you started.

About a year ago I joined a small company that is focused on delivering top-notch predictive analytics software to address some very specific commercial and educational market needs. Since joining the company as the solution architect, I have found that the single most complex and dynamic part of our engagement and solution delivery process is data acquisition, normalization, and access. This is largely because of the diversity in the platforms, technologies, and applications written for and used by our clients.

We work with our clients to analyze their data for the purpose of building predictive models. The requirements for model building are pretty straightforward: we need a clean and consistent view of the data - requirements not unlike those of any other analytical or reporting process. But the complexity of today’s enterprise environments makes this more challenging. Additionally, we aren’t looking at a snapshot of data at a single point in time; in many cases we are looking at the data in real time or near real time. We are often working with both structured and unstructured data that comes in a myriad of formats and is accessed using many protocols - everything from relational data in databases, to web services, to flat files and spreadsheets. All of that information needs to be identified, cataloged, gathered, date/time stamped, and recorded for time-based analysis.

Clients that understand master data management and have sound data governance policies are easier to work with because they understand the value of their data and most importantly how to get it. At the other end of the spectrum are those companies that have their data in many disparate systems, have no data governance or ownership policies, and don’t know the value of their data. Getting access to their data and getting it into a clean and consistent form can be quite a challenge.

Therefore, my next few posts will talk about the challenges we are facing and our approach to solving the data integration and normalization needs for a predictive analytics solution. My hope is that the information you find here will help you prepare for using predictive analytics in your organization to improve and optimize the decision making you do on a daily basis using one of your company’s most valuable assets – your data.

/imapcgeek

Tuesday, January 31, 2012

WCF Test Client Error

When starting to debug a new WCF service I received the following error message:

“The contract ‘IMetadataExchange’ in client configuration does not match the name in service contract, or there is no valid method in this contract.”

[Screenshot of the WCF Test Client error dialog]

I had added my MEX endpoint in the web.config and was able to view the service WSDL in a browser, so I really wasn’t sure why I was getting this. As it turns out, this error message is caused by an endpoint declaration in machine.config that looks like this:

<endpoint address="" binding="netTcpRelayBinding" contract="IMetadataExchange" name="sb" />

I’ve commented it out for now, and the WCF Test Client can now properly interrogate the service metadata without giving me an error.
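For comparison, a typical MEX endpoint declaration in web.config looks something like this; the service and contract names below are placeholders rather than my actual configuration:

<system.serviceModel>
  <services>
    <service name="MyNamespace.MyService" behaviorConfiguration="MyServiceBehavior">
      <!-- application endpoint -->
      <endpoint address="" binding="basicHttpBinding" contract="MyNamespace.IMyService" />
      <!-- metadata exchange endpoint -->
      <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
    </service>
  </services>
  <behaviors>
    <serviceBehaviors>
      <behavior name="MyServiceBehavior">
        <serviceMetadata httpGetEnabled="true" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>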

After a bit of Googling this error, I found this explanation from Joel C on StackOverflow.com. It looks like at some point an installation of the .NET Service SDK (which I don’t remember installing) updated the machine.config with that entry.

For now, it works having removed it.

Monday, January 30, 2012

You Are Dead to Me

OK, even if Silverlight isn’t completely dead, it certainly is starting to smell pretty bad. Mid last year we were looking at Silverlight as a viable solution for our UI needs. It has a rich programming model, the user experience is excellent, and it has pretty decent market penetration – certainly not as good as Flash, but respectable.

However, as the majority of our development is greenfield and we are looking to build for the future, it just didn’t make long term sense for us to consider building on a product whose future was looking pretty iffy. When you consider that Microsoft recently cancelled Mix 2012, it’s clear to me that the future lies elsewhere.

Sure, you can still build on Silverlight, and sure, it will be supported for quite some time to come. But I asked myself: why build on a technology whose future is clearly in question? Certainly Microsoft is continuing to support XAML development for Metro style applications, but I questioned whether I was just delaying the inevitable.

In the end, I felt the best decision was to go with HTML5 for the applications we are building for the future. We have begun active development in ASP.NET MVC3 with the Razor view engine. We have achieved our goals of providing a rich user experience, excellent performance, and a testable loosely-coupled code base. It’s also a solution we can work with right now, which means we can deliver business value right away without sacrificing functionality.

v.Next of our applications will give us the opportunity to revisit the scene and reevaluate the options. I think the best news is that there is an excellent set of options out there now and the future looks very bright indeed.

Wednesday, January 18, 2012

Creating a Dependency Injected WCF Data Service

In this post, I’ll explain how to decouple your WCF data service from a specific data context. This is useful in many ways including changing out your context implementation at runtime or mocking out your context for testing purposes. The example code will use MEF (Managed Extensibility Framework), but any dependency injection framework, service locator implementation, or factory pattern could be used.

Getting Started

The first thing you will need to get started is an implementation of DbContext. I’ve chosen a code-first approach in this example because it gives me explicit control over the code in the context. Entity Framework 4.1, available as a NuGet package, provides a simple, purpose-driven API that lets me create a new DbContext and specify the entity types it is responsible for in just a few lines of code. The base class and the EF 4.1 plumbing do all the hard work.

public class PersonContext : DbContext
{
    public IDbSet<Person> People { get; set; }
    public IDbSet<Address> Addresses { get; set; }
}

Figure 1: Basic PersonContext implementation

This implementation satisfies the most basic requirements of EF4.1 for a DbContext, but as you’ll see as we go along, we’ll want to flesh it out a bit more in order to support MEF’s requirements.


Give Your DbContext an Interface


Uncle Bob Martin’s SOLID object-oriented principles are a must-read for all programmers. The ‘D’ in SOLID stands for Dependency Inversion: in a nutshell, dependencies between objects should be based on abstractions, not concretions. Our next step is to create an interface that will represent our DbContext.

 

public interface IDbContext : IDisposable
{
    IDbSet<TEntity> Set<TEntity>() where TEntity : class;
    int SaveChanges();
}
Figure 2: IDbContext interface
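The figures here don’t show how PersonContext picks up this interface, so here’s a minimal sketch of one way to wire it, assuming an explicit interface implementation. DbContext already supplies SaveChanges() and Dispose(), and its Set<TEntity>() returns DbSet<TEntity>, which implements IDbSet<TEntity>, so the interface member can simply delegate to the base class:

using System.Data.Entity;

public class PersonContext : DbContext, IDbContext
{
    public IDbSet<Person> People { get; set; }
    public IDbSet<Address> Addresses { get; set; }

    // Explicit implementation: DbContext.Set<TEntity>() returns DbSet<TEntity>,
    // which satisfies the IDbSet<TEntity> the interface asks for.
    IDbSet<TEntity> IDbContext.Set<TEntity>()
    {
        return base.Set<TEntity>();
    }
}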

 

Initializing Your Context


The EF4.1 DbContext class can take a connection string as a constructor parameter. For simple cases, this might be enough for you. However, if your context requires additional information to operate properly, you may want to create a configuration class that you can inject into your context through its constructor. I’ve taken this approach because it allows for greater flexibility in initializing the context.


[Export]
public class DbContextConfiguration
{
    public string Name { get; set; }
    public string ConnectionString { get; set; }
}

Figure 3: DbContextConfiguration needed to initialize your DbContext

 

[ImportingConstructor]
public PersonContext([Import("PersonContextConfiguration", typeof(DbContextConfiguration))] DbContextConfiguration configuration)
    : base(configuration.ConnectionString)
{
    // _configuration is a private DbContextConfiguration field on PersonContext
    _configuration = configuration;
}
Figure 4: PersonContext constructor
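One wiring detail worth calling out: the constructor imports the configuration under the contract name "PersonContextConfiguration", while Figure 3 exports it with a bare [Export], which uses the type’s default contract name. If the configuration part comes from an attributed catalog, the export needs to carry the same contract name for MEF to match them up; alternatively, you can compose an instance into the container at startup. A sketch of both options follows; the connection string value is just a placeholder:

// Option 1: export the configuration under the named contract the constructor imports.
[Export("PersonContextConfiguration", typeof(DbContextConfiguration))]
public class DbContextConfiguration
{
    public string Name { get; set; }
    public string ConnectionString { get; set; }
}

// Option 2: compose an instance into the container at startup under that contract name.
container.ComposeExportedValue("PersonContextConfiguration",
    new DbContextConfiguration
    {
        Name = "PersonContext",
        ConnectionString = "name=PersonContext"   // placeholder connection string
    });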

 

The first thing you’ll notice in the code above is the [Export], [ImportingConstructor], and [Import] attributes applied to the DbContextConfiguration class and the PersonContext constructor. These are MEF attributes, used to support the dependency injection pattern; MEF auto-magically wires up the dependencies through a call to ComposeParts(). If you are new to MEF or unfamiliar with how it works, here are a couple of useful links to get you started.

 



 

Creating Your WCF Data Service


Writing a WCF data service couldn’t be easier. In a matter of a few mouse clicks, you can be serving up REST-based data. Microsoft has dramatically reduced the effort required to implement a service by supplying a rich API that does a lot of the work under the covers.


[Screenshot of the Add New Item dialog]


Figure 5: Add New Item


Adding a new WCF Data Service through the Add New Item dialog results in an entry point to that API via the DataService<T> class.



public class PersonDataService : DataService< /* TODO: put your data source class name here */ >
{
    // This method is called only once to initialize service-wide policies.
    public static void InitializeService(DataServiceConfiguration config)
    {
        // TODO: set rules to indicate which entity sets and service operations are visible, updatable, etc.
        // Examples:
        // config.SetEntitySetAccessRule("MyEntityset", EntitySetRights.AllRead);
        // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}

Figure 6: Boilerplate data service code


Simply replace the boilerplate TODO between the generic type brackets with a class that extends DbContext and you’re practically good to go. The commented snippets even serve as placeholders for configuration changes you can use to set access rules for the entity sets and service operations exposed by your DbContext.


*Note: in my example above, the DataServiceProtocolVersion.V3 is implemented in the October 2011 CTP release for data services. I began using it in order to integrate with the Entity Framework 4.1 DbContext API.


In our case, we will declare the data service this way:


public class PersonDataService : DataService<IDbContext> {…}


Override CreateDataSource


Now for the final piece of the puzzle. The DataService class provides a convenient way to control the instantiation of your service’s data source dependency: we override the CreateDataSource method with an implementation such as this:


protected override IDbContext CreateDataSource()
{
    // compositionContainer is a MEF CompositionContainer held by the service (see the sketch below)
    context = compositionContainer.GetExportedValue<IDbContext>();
    return context;
}

Figure 7: CreateDataSource implementation
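For GetExportedValue<IDbContext>() to find the context, PersonContext must be exported under the IDbContext contract (for example, with [Export(typeof(IDbContext))] and, typically, [PartCreationPolicy(CreationPolicy.NonShared)] so each request gets a fresh context), and the service needs a container to pull it from. Here is a minimal sketch of that wiring; the AssemblyCatalog and the static field are illustrative choices rather than the original code:

using System.ComponentModel.Composition.Hosting;
using System.Data.Services;

public class PersonDataService : DataService<IDbContext>
{
    // Build the container once from the MEF parts in this assembly.
    private static readonly CompositionContainer compositionContainer =
        new CompositionContainer(new AssemblyCatalog(typeof(PersonDataService).Assembly));

    private IDbContext context;

    protected override IDbContext CreateDataSource()
    {
        // Resolve whichever IDbContext implementation the container exports.
        context = compositionContainer.GetExportedValue<IDbContext>();
        return context;
    }
}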

 

Your implementation may differ, but essentially you want to use your MEF composition container, service locator, or other factory implementation in order to get an instance of your context based on the interface type. MEF has numerous ways for you to resolve this dependency.

 

A critical concern for your code in this area is to handle the multiplicity of dependencies you may find in your container. In a future post, I’ll demonstrate how to specify metadata to filter the results of your composition request.

 

Conclusion


Creating a loosely coupled design can sometimes be a challenge. In the case of WCF Data Services, Microsoft had the foresight to make this chore easier. Through abstraction of your data context, use of an extensibility framework like MEF (or a service locator or factory implementation), and a little glue code, you can decouple the service from the context and reap the rewards of runtime composition and increased testability.