Jack’s Experience Using UltiPro Web Services

March 19, 2011

This post originally appeared on a web services demo site.

Jack is my 2-year-old son.

My UltiPro Web Services Development Experience

By Jack Allard, Four Leaf Foods Inc.

“Four Leaf Foods.com” didn’t happen overnight. It has been a life-long dream of mine to bring this portal to fruition, and only recently has technology caught up to my imagination.

It all started an eternity ago. It feels like it’s been three weeks, but logic tells me it’s probably only been a couple days. I was taking a much-deserved break from lining up my matchbox cars on the living room table and pointing at things, when I stumbled across a fascinating red device with white knobs buried beneath a pile of rubber dinosaurs and an assortment of stuffed animals. What was this strange relic, and how had it escaped my notice for all of this time? I started to dig it out, feeling as though it were beckoning me, calling out to me.

At last I had it. I instinctively shook it and enjoyed the noise, as if a pile of granular objects had been upset, then settled, cleaning the slate for a new drawing. I spent several minutes gazing at its magnificence from every angle, learning to appreciate its power and elegance. This device is a work of art, and it needed to be appreciated. After a respectable amount of time, I tentatively started to work the shiny white knobs with my fingers and opposable thumbs. My eyes danced with glee as each movement, no matter how subtle, produced results on the slate. Surely the world had never seen a device such as this before. This is clearly the technology to change everything.

Then I drooled on my shirt and took a nap.

Upon waking up in what has been described as a wholly unnatural position, I cast a glance towards my matchbox cars. While still fascinating and deceptively stackable, I’m sad to say that their allure had been tarnished. My eyes had been opened, and all I could think about was the Etch-A-Sketch… the shiny red Etch-A-Sketch, and its potential impact on my entrepreneurial efforts with Four Leaf Foods. The high-res tablet device seemed to be the perfect medium on which to deliver up-to-the-moment content to everyone interested in the happenings of Four Leaf Foods and its affiliates. It would simply be a matter of drawing the content on a pre-determined number of Etch-A-Sketches and distributing them to our customers in secure, unshakeable boxes. It was so simple and so obvious, I couldn’t believe it hadn’t already been done. I immediately began to whiteboard the idea using crayons on the wall next to a whiteboard.

It was as if I was living for the first time. My brain was 93% focused on this exciting new endeavor; 3% was allocated to essential life functions, and the remaining 4% was dedicated to finding dangerous places to put my fingers. (It turns out that ELECTRICAL OUTLETS, while aesthetically pleasing, should be avoided.) I scribbled. I diagrammed. I pouted. I overcame obstacles. I adapted. I worked the problems and created new problems.

Seconds turned to minutes. Minutes turned into several minutes. Several minutes turned into almost 10 minutes. The wall was a disaster. The whiteboard was still clean. My hair was in shambles. I was missing half a crayon and my nose hurt; I still think those last two are related but have been unable to prove it. Despite my best efforts, I had come to a hurtful but unavoidable conclusion: the Etch-A-Sketch Tablet Computer is not such a great means of sharing content with my customers.

The impact of this realization was devastating. All that I had worked for was lost. It had all been a waste; for naught. How could I have gone from such exuberance to such despair so quickly? How could I have been so silly? What was it all for? Was the problem that I was overzealous and ahead of my time, or that my visions of grandeur were sadly misled? These are questions that I could only ask, for I was too depressed to try to reason answers.

My little sister Ellie happened upon me during my darkest time. She crawled into the room and sat across from me looking at me inquisitively, sensing that something was wrong but unable to express her concern. But she tried. She commenced incoherent babbling and arm flailing. I studied her trying to discern the meaning of it all. What was she trying to express? Was the sneeze part of the message or just a coincidence? I wanted to understand, and I wanted a tissue.

Then we shared a moment that I will never forget. Ellie used a futon to pull herself into the standing position, then let go of the futon. She stood there, shakily, uncertain, for a solid 5 seconds before falling backward onto her bum. She didn’t cry. She didn’t complain. She reached for the futon again, and pulled herself into the standing position and let go. She stood as long as she could, maintaining eye contact, then fell on her bum again. She repeated it a third time to drive the point home, and suddenly her meaning became clear: The reason we fall down is so that we can pull ourselves back up. (© Batman Begins, 2005).

Ellie was right, and I was ashamed for needing the reminder. But that’s what family is for, I guess: to get us over the bumps. I was invigorated once again! Ellie saw the dawning of understanding on my face and nodded approval. Her job was done. She got on her knees and crawled out of the room, leaving me to my newfound enthusiasm. Thank you, Ellie.

My thoughts were running rampant. So the Etch-A-Sketch Tablet Computer didn’t work out so well as a global communications medium. Perhaps the internet held more potential? Would this be a good use of a website? Had anyone tried using a website for this type of thing before? I found a new wall to scribble upon, and scribble I did. I scribbled the night away. I scribbled through story time. I scribbled through that time of day when I’m given food to throw onto the floor. I was an unstoppable scribbling machine, each scribble contributing to the greater good, the final result of which became clearer with each scribble. I didn’t know what it was all leading to, but I knew it was going to be good. I didn’t have the same concerns that I had while working the Etch-A-Sketch Tablet Computer problem; I knew this was going to work out just fine. Better than fine, in fact. It was going to be perfect. Revolutionary. Legendary. Some might even say Epic.

As the scribbling progressed, the damaging effects of black crayon became exceedingly obvious, as did the final solution to that which had plagued my thoughts. When at last all of the metaphorical fog had been lifted, I stopped scribbling and looked at that which I had created. Correction: not that which I had created, for it was always there; I just had to chip away the pieces (© Rambo III, 1988).

I stepped back to examine and appreciate the result. I wished Ellie had been there to share the moment, for it was she who got me to this point. Even though she was absent, I knew that she was elsewhere in the house, probably gnawing on a piece of furniture, knowing that I had succeeded.


It was so obvious! In retrospect, the whole Etch-A-Sketch Tablet Computer thing was a silly idea. Heck, it doesn’t even have Wi-Fi. SharePoint is so much better for tens of reasons. For example, I can put a weather widget on the page. That’s right, weather right there on the page. Now, I don’t have to watch the local news to get the weather; I can just look at our portal, and there it is. When it’s sunny out, it shows a sun. When it’s raining out, I bet it shows clouds or something. That’s so much better than looking out the window. The winds of change are upon us, and they shall be embraced!

But what about non-weather related functionality? While I agree it’s going to be hard to beat the weather widget, I’d be remiss if I didn’t try.

Four Leaf Foods uses UltiPro as its HRMS solution. UltiPro is loaded with all kinds of good information about me and my team (although, sadly, it lacks a weather widget). I would really like to display some UltiPro information in my shiny new SharePoint portal. Can such a crazy dream be realized, or will my imagination once again be limited by the confines of technology?

I quickly consulted my UltiPro counterpart at Ultimate Software. Though four weeks my senior, the representative was easy to talk to despite our age difference, and knowledgeable in all things UltiPro. We had a good laugh about the Etch-A-Sketch idea, and then exchanged some colorful anecdotes regarding our favorite flavors of Play-Doh. It was a pleasant and informative experience, and I’d like to think that we both left the conversation as better toddlers.

It turns out that UltiPro now exposes a variety of SOAP 1.2 web services. That’s right, 1.2. Not 1.1. Take a moment to let that sink in. They’re not pulling any punches. They’re using the latest and greatest standards as defined as recently as April 2007. No dinosaurs here.

The services allow my portal to query data and display it in pretty grids throughout the site. (And “pretty” isn’t a subjective term here. They are factually, indisputably pretty. Don’t argue with me on this one.) I can also edit and save the data via the services. This allows me and my team to get common things done in UltiPro without logging into UltiPro. Weird, right? Plus, we can keep our eyes on the weather.

The following services are now available:

  1. Login Service – to get an UltiPro secure token
  2. Contacts
  3. Employee Address – The weather widget is based on my address
  4. Jobs – I use this to find all the people that report to me
  5. Person
  6. Compensation
  7. iPhone – to retrieve a photo of myself!
And more are on the way!
Accessing a service is a two-step process, and it takes just a little bit of work. That’s the price of security. We wouldn’t want just anyone downloading our personnel photos.

Steps to calling a service

1. Call the authentication service to get a secure UltiPro token. You pass in the following information:

a. User name – for you

b. Password – for you

c. User Access Key – for you. This is, basically, a system-assigned password. You can get it from the “Web Services” page in UltiPro.

d. Client Access Key – for your company. You can get this on the “Web Services” page in UltiPro.

2. Call the service. Each time you make a service call, you pass in the following as SOAP headers:

a. The token you obtained in step 1

b. Your company’s client access key.

That’s it! Once you have the token, you can reuse it for all subsequent service calls. Our SharePoint portal makes use of several of the services on the main page, keeping me up to date with all things of interest. Because it’s SharePoint, I can add and remove things on a whim. As new UltiPro services come online, I can write new web parts to consume them. All is right in the world.
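For fellow developers, here’s roughly what that two-step pattern looks like in code. This is just a sketch: the class name, service names, and header names below are my own placeholders, not the actual UltiPro proxy types (you’d generate those from the WSDL).

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of the two-step call pattern. The transport
// delegate stands in for the real SOAP plumbing: it receives a
// service name plus SOAP headers and returns the response.
public class UltiProSketchClient
{
    private readonly Func<string, IDictionary<string, string>, string> transport;
    private readonly string clientAccessKey;
    private string token;

    public UltiProSketchClient(
        Func<string, IDictionary<string, string>, string> transport,
        string clientAccessKey)
    {
        this.transport = transport;
        this.clientAccessKey = clientAccessKey;
    }

    // Step 1: call the authentication service to get a secure token.
    public void Authenticate(string userName, string password, string userAccessKey)
    {
        var headers = new Dictionary<string, string>
        {
            { "UserName", userName },
            { "Password", password },
            { "UserAccessKey", userAccessKey },
            { "ClientAccessKey", this.clientAccessKey }
        };
        this.token = this.transport("LoginService", headers);
    }

    // Step 2: every subsequent call carries the token and the
    // client access key as SOAP headers.
    public string Call(string serviceName)
    {
        var headers = new Dictionary<string, string>
        {
            { "UltiProToken", this.token },
            { "ClientAccessKey", this.clientAccessKey }
        };
        return this.transport(serviceName, headers);
    }
}
```

With a real binding, the transport delegate would be replaced by the generated SOAP client, but the shape is the same: authenticate once, capture the token, and attach it (plus the client access key) to every subsequent call.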


It’s been a wonderful journey that I will never forget. I woke up a few days ago with plans to do nothing more than dump cereal over my head. I had no idea how quickly things could change. Not only do I have cereal on my head, but now I also have a spiffy new SharePoint portal that tells me the current weather and shows me a picture of myself. I learned about consuming web services and, more importantly, I learned a little something about myself. When the going gets tough, I don’t just throw in the blankie. I try again. I once again must thank Ellie for that reminder.

Now it’s time to move on to my next journey. I don’t know what it will entail, but I suspect it will involve more web services. Yay for UltiPro Web Services!

Happy Coding.



Multi-Index Cache, Revisited

March 11, 2011

When I set out to do the multi-index cache, it occurred to me that instead of using the secondary indexes as mappings to the primary index, I could just store another reference to the same object.

I don’t know why I didn’t choose to do it that way, but now that I’ve been thinking about it for a while, I don’t see why not. It would eliminate a second lookup, and I can’t think of any adverse effects.

So, if there are three indexes, then instead of storing one object reference and three keys, we store three object references and three keys.
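A sketch of the difference, using a hypothetical User type (not the real BluRayFriend classes):

```csharp
using System.Collections.Concurrent;

// Hypothetical User type, for illustration only.
public class User
{
    public int Id { get; set; }
    public long FacebookId { get; set; }
}

public static class LookupSketch
{
    // Current design: the secondary index maps its key to the primary
    // key, so a secondary lookup costs two dictionary hits.
    public static User ViaMapping(
        ConcurrentDictionary<string, User> primary,
        ConcurrentDictionary<string, string> facebookToUserId,
        long facebookId)
    {
        string userId = facebookToUserId[facebookId.ToString()];
        return primary[userId];
    }

    // Proposed change: every index stores a reference to the same
    // object, so any lookup is a single dictionary hit.
    public static User Direct(
        ConcurrentDictionary<string, User> byFacebookId,
        long facebookId)
    {
        return byFacebookId[facebookId.ToString()];
    }
}
```

Either way there’s still only one object in memory; the second design just points at it from more than one dictionary.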

The next time I’m in there, I’ll change it and see if I’m forgetting anything. The API doesn’t change at all, just the innards.

Simple Multi-Index Cache

March 9, 2011

My new BluRayFriend site (still under construction) requires some caching. The server is Windows 2003, so AppFabric isn’t an option.

One of the things I need to store is Ratings. For it, I created a little API that uses System.Runtime.Caching. I can create new single-index caches simply by inheriting from a base class and filling in some blanks.

I started to use that same approach for the user cache, but it didn’t last very long. I have to be able to retrieve users two ways: By Id and by FacebookId. Logging in uses the FacebookId, pretty much everything else uses my own system user id.

This provided me the opportunity to do something that I’ve wanted to do for a long time: Create a multi-index cache interface. The multi-index part means that you can retrieve objects via different criteria.

The trick is the key field. If I need to look up FacebookId=12345, how do I know what the UserId is to query the cache? You can use Linq to query the dictionary to find the right object, assuming that it’s there, but then you lose the advantages of a dictionary.

The solution I came up with is to have multiple dictionaries: one for the primary index, and one for each secondary index. The secondary dictionaries map one key to another. So, the secondary index for FacebookId maps “FacebookId 12345 = UserId 67”. When you query a secondary index, it looks up the mapping, then goes to the primary index.

What if the object doesn’t exist in the secondary index? No problem. The index has a delegate that knows how to go get the object by FacebookId. The primary index and other secondary indexes all know how to extract their own key from the object. Once the index pulls back the object, it gives it to all of the other indexes. The primary index inserts or updates itself, and the other secondary indexes extract the key and update/insert themselves too.

If you hit the primary key directly (i.e., the user id, in this example), then it works pretty much the same way. If the object isn’t found, it fires the delegate that looks it up. It then passes the object to all of the secondary indexes, and they each figure out their own keys and update/insert themselves.

If you remove an object from the cache, it too is passed to all of the indexes so that they can update themselves.
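The base class itself isn’t shown in this post, but its lookup flow might be sketched like this. To be clear, these are illustrative names and a simplified shape, not the actual code:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Illustrative sketch of the lookup flow described above.
public class IndexSketch<T> where T : class
{
    // The primary index uses Items; secondary indexes use Mappings
    // (secondary key -> primary key).
    public ConcurrentDictionary<string, T> Items = new ConcurrentDictionary<string, T>();
    public ConcurrentDictionary<string, string> Mappings = new ConcurrentDictionary<string, string>();
    public Func<T, string> GetKey;   // extract this index's key from an object
    public Func<string, T> Getter;   // fetch the object on a cache miss
}

public class MultiIndexCacheSketch<T> where T : class
{
    private readonly IndexSketch<T> primary;
    private readonly List<IndexSketch<T>> secondaries;

    public MultiIndexCacheSketch(IndexSketch<T> primary, params IndexSketch<T>[] secondaries)
    {
        this.primary = primary;
        this.secondaries = new List<IndexSketch<T>>(secondaries);
    }

    // Primary-index path: hit the dictionary, or fall back to the delegate.
    public T GetItem(string primaryKey)
    {
        T item;
        if (!this.primary.Items.TryGetValue(primaryKey, out item))
        {
            item = this.primary.Getter(primaryKey);
            if (item != null)
            {
                this.Distribute(item);
            }
        }
        return item;
    }

    // Secondary-index path: map the secondary key to the primary key,
    // or fire the secondary's delegate and distribute the result.
    public T GetItem(IndexSketch<T> secondary, string key)
    {
        string primaryKey;
        if (secondary.Mappings.TryGetValue(key, out primaryKey))
        {
            return this.GetItem(primaryKey);
        }

        T item = secondary.Getter(key);
        if (item != null)
        {
            this.Distribute(item);
        }
        return item;
    }

    // Give the object to every index so each can insert/update itself.
    private void Distribute(T item)
    {
        string primaryKey = this.primary.GetKey(item);
        this.primary.Items[primaryKey] = item;
        foreach (IndexSketch<T> secondary in this.secondaries)
        {
            secondary.Mappings[secondary.GetKey(item)] = primaryKey;
        }
    }
}
```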

Cache Overview

This top-of-the-line Excel snapshot shows the single primary index and two secondary indexes. See how the right columns of the secondary indexes correlate to the left columns of the primary. That’s the mapping. Also note that one of the shown indexes is by last name; that’s fictional. I don’t have such an index, but thought another one was necessary to help illustrate.

Why Multiple Indexes Instead of Multiple Caches?

In my example, you could imagine that there’s a UserId cache and a FacebookId cache. The problem is the duplication and the synchronization. If I update a user, do I really want to know that I have to flush or update two caches? And if so, how do I know the ID to flush on each cache? And why would I want two copies of the same object in memory anyway? And why do zebras have stripes?

In essence, a multi-index cache is multiple caches, but they’re all wrapped up nice and neat.

The Code

The code for the API is less than 120 lines. I wrote it pretty quickly, and have spent more time talking about it since than actually doing anything with it. I’m sure it could use a little more work, but not much. It’s pretty solid as is. The few lines of code can be credited to the ConcurrentDictionary in .NET 4, which has the delegation and thread safety built in. Prior to .NET 4, I would still base it on a dictionary, but would have to code the thread safety and delegation myself.

To make it accessible, I created two base classes.

public abstract class MultiIndexCache<T> where T : class, new()
{
    protected abstract IEnumerable<Index<T>> GetSecondaryIndexes();
    protected abstract Index<T> GetPrimaryIndex();
}

public abstract class Index<T> where T : class, new()
{
    // Each index is identified by name, supplied by the subclass.
    protected Index(string indexName)
    {
        this.Name = indexName;
    }

    public string Name { get; private set; }

    public abstract string GetKey(T cacheItem);
    public abstract T Getter(string key);
}

For each cache, you implement MultiIndexCache<T>, and for each Index within the cache, you implement Index<T>. In my case, T is BluRayFriendUser.

Through a funny coincidence, the code for the cache and the index comes up to about 120 lines. The implementation of the user cache is also 120 lines. Neat.

Anyway, here’s the full-blown implementation. The following code contains:

  1. UserCache, which is the cache set. The indexes are nested classes within UserCache
  2. UserCacheIndex : Index<BluRayFriendUser>. This is a convenient base class for each of the two indexes. Both indexes require an IUserProvider.
  3. ByUserId : UserCacheIndex – the primary index
  4. ByFacebookId : UserCacheIndex – the secondary index
namespace BluRayFriend.Api
{
    using System.Collections.Generic;
    using BluRayFriend.Api.Cache;

    public class UserCache : MultiIndexCache<BluRayFriendUser>
    {
        /// <summary>
        /// Base class for the UserCache indexes
        /// </summary>
        public abstract class UserCacheIndex : Index<BluRayFriendUser>
        {
            protected IUserProvider UserProvider { get; private set; }

            protected UserCacheIndex(IUserProvider userProvider, string indexName)
                : base(indexName)
            {
                this.UserProvider = userProvider;
            }

            protected UserCacheIndex(string indexName)
                : this(new DbUserProvider(), indexName)
            {
            }
        }

        /// <summary>
        /// ByUserId index
        /// </summary>
        public class ByUserId : UserCacheIndex
        {
            public ByUserId(IUserProvider userProvider)
                : base(userProvider, "ByUserId")
            {
            }

            public ByUserId()
                : base("ByUserId")
            {
            }

            public override string GetKey(BluRayFriendUser cacheItem)
            {
                return cacheItem.Id.ToString();
            }

            public override BluRayFriendUser Getter(string key)
            {
                return this.UserProvider.GetUser(int.Parse(key));
            }
        }

        /// <summary>
        /// ByFacebookId index
        /// </summary>
        public class ByFacebookId : UserCacheIndex
        {
            public ByFacebookId(IUserProvider userProvider)
                : base(userProvider, "ByFacebookId")
            {
            }

            public ByFacebookId()
                : base("ByFacebookId")
            {
            }

            public override string GetKey(BluRayFriendUser cacheItem)
            {
                return cacheItem.FaceBookId.ToString();
            }

            public override BluRayFriendUser Getter(string key)
            {
                return this.UserProvider.GetUserByFacebookId(long.Parse(key));
            }
        }

        /// <summary>
        /// Retrieve a user from the cache, via the primary index
        /// </summary>
        public BluRayFriendUser GetUser(int userId)
        {
            return this.GetItem(userId.ToString());
        }

        /// <summary>
        /// Retrieve a user from the cache, via the facebook index
        /// </summary>
        public BluRayFriendUser GetUserByFacebookId(long facebookId)
        {
            return this.GetItem("ByFacebookId", facebookId.ToString());
        }

        /// <summary>
        /// UserId is the primary index.
        /// </summary>
        protected override Index<BluRayFriendUser> GetPrimaryIndex()
        {
            return new ByUserId();
        }

        /// <summary>
        /// Return the list of secondary indexes. There's only one.
        /// </summary>
        protected override IEnumerable<Index<BluRayFriendUser>> GetSecondaryIndexes()
        {
            return new List<Index<BluRayFriendUser>>
            {
                new ByFacebookId()
            };
        }
    }
}

The UserCache class implements GetPrimaryIndex and GetSecondaryIndexes. Those methods simply provide all of the indexes to the base class.

UserCache has two of its own methods that make the caches accessible: GetUser(int userId) and GetUserByFacebookId(long facebookId). Each of those methods makes a call to the base method GetItem, passing it the name of the index to query and the key.

The indexes each have a name. The name is defined as a constructor parameter. To use an index, you have to refer to it by name.

The indexes implement two methods:

  1. GetKey – this method is responsible for looking at an object and creating a key for it. The UserId index returns UserId.ToString(), and the FacebookId index returns FacebookId.ToString()
  2. Getter(string key) – if the item isn’t found on the index, then the getter method is responsible for retrieving and returning the item

And that’s pretty much it for coding one of these things. A couple of subclasses, and a few overrides.

I didn’t want to require that the keys be strings, but it ended up being the most practical choice. This implementation uses a ConcurrentDictionary, but if I ever swap out the store for System.Runtime.Caching (or even AppFabric), the keys would eventually have to become strings anyway.


It uses ConcurrentDictionary, which handles the concurrency and delegation automatically.

If two threads request the same user via different indexes at the same time, the cache will load the same object twice. One thread will get it by user id, and the other will get it by facebook id. All that means is that the object is going to get inserted and then updated very quickly. The updated object will be identical to the object that was inserted, but that’s ok. All subsequent requests will get the one from the cache.

As for keeping the dictionaries in sync: it’s self-maintaining, especially since there isn’t an expiration policy in this version. But, even if that weren’t the case, it would still work just fine. Each index has a delegate that knows how to retrieve the object. So even if two dictionaries are out of sync for the nanosecond that you make a query, the delegate will fill in the blank.

Things to Watch Out For

The objects in the cache should be immutable. It’s a hashtable, so if one of the properties that comprise a key changes, the item might no longer be found. Furthermore, you usually don’t want someone pulling an item from the cache and altering it; they may not realize that they’re altering the object that everyone is using.

The most obvious way to deal with this is to clone the object before returning it. The base class can do this automatically if the object implements ICloneable. Or, if the cache developer wants to handle it himself, he can transform the result of this.GetItem().
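The ICloneable guard could be as small as this. (A sketch of the idea, not code that’s in the cache today; Rating is a hypothetical cacheable type.)

```csharp
using System;

// Sketch: clone-on-return so callers can't mutate the shared cached
// instance. A cache base class could run every result through this.
public static class CloneGuard
{
    public static T CloneIfPossible<T>(T item) where T : class
    {
        ICloneable cloneable = item as ICloneable;
        if (cloneable != null)
        {
            return (T)cloneable.Clone();
        }

        // Not cloneable: hand back the shared instance and trust the caller.
        return item;
    }
}

// A hypothetical cacheable type that opts in to cloning.
public class Rating : ICloneable
{
    public int Stars { get; set; }

    public object Clone()
    {
        return new Rating { Stars = this.Stars };
    }
}
```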

The only other thing that has to be considered is nulls, an issue which I have momentarily chosen to ignore. The values that comprise a primary index should always exist; otherwise it’s just not a valid primary index. In my case, everyone has a user id. But what if a user doesn’t have a facebook id? Currently, the Getter for the facebook index will return a null, and then the cache will try to store the null. In that case, it needs to recognize that the user is not retrievable by facebook id, and not try to update anything. To get fancy, it should also know that it previously attempted to get a particular user by facebook id, and not try a second time (unless a flush occurred).
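If I ever do get fancy, a per-index set of known misses would probably do it. Again, a sketch of a possible approach, not what the current code does:

```csharp
using System;
using System.Collections.Concurrent;

// Sketch of negative caching for a secondary index: remember keys
// that already missed so we don't hit the delegate a second time.
public class NegativeCachingIndex<T> where T : class
{
    private readonly ConcurrentDictionary<string, bool> knownMisses =
        new ConcurrentDictionary<string, bool>();
    private readonly Func<string, T> getter;

    public NegativeCachingIndex(Func<string, T> getter)
    {
        this.getter = getter;
    }

    public T TryGet(string key)
    {
        if (this.knownMisses.ContainsKey(key))
        {
            return null; // we already know this key doesn't resolve
        }

        T item = this.getter(key);
        if (item == null)
        {
            this.knownMisses[key] = true; // don't ask again until a flush
        }
        return item;
    }

    public void Flush()
    {
        this.knownMisses.Clear();
    }
}
```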

Expiration Policy

System.Runtime.Caching already has a bunch of expiration policies and the API to support them. I wouldn’t try reinventing that. Rather, I’d replace the concurrent dictionaries with the System.Runtime.Caching objects and take advantage of the policies there. My single-index cache uses System.Runtime.Caching and takes advantage of the expiration policies. It’s a nice feature. Previously, that API was only available in ASP.NET. It’s nice to have it out and about.

Live Cache Update

I’ve covered this in previous posts, but it’s really neat so I’ll repeat it. In the case of the RATING cache, the AVERAGE RATING needs to be recalculated each time someone saves a rating for a movie. I want to do that asynchronously via messaging so that the database doesn’t get assaulted, yadda yadda. And, when the average recalc is complete, I want all web servers to be notified so that they can update their rating cache. This is done via Simple Bus: http://someguysoftware.wikidot.com/simple-bus

private void SetupSubscribers()
{
    // fires when a user rating changes
    Bus.Default.Subscribe("RatingChanged", (m) =>
    {
        var changed = (RatingChangedEvent)m.Object;
        ServiceFactory.GetReviewService().CalculateAverageRating(changed.ProductId);
    });

    // fires when an average rating has changed
    Bus.Default.Subscribe("AverageRatingChanged", (m) =>
    {
        var changed = (AverageRatingChangedEvent)m.Object;
        Caches.Ratings.SetAverageRating(changed.ProductId, changed.RatingAverage);
    });
}

The preceding code is in the Global.asax. The first subscription is responsible for the recalculation. Really, that shouldn’t be done in the web process; it’s temporary until I have somewhere else to put it.

The CalculateAverageRating method publishes a message when it’s done with the recalc. The message type is called AverageRatingChanged, and it too is caught in the Global.asax. The second subscription updates the rating cache. If there were 10 servers, all 10 would receive the same message and update their caches.
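To see how the two subscriptions chain, here’s a toy in-memory stand-in for the bus. To be clear: this is not SimpleBus code, and it skips the message wrapper and the cross-process queueing that make the real thing useful; it just shows the publish/subscribe shape.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// A toy in-process bus, for illustration only. SimpleBus does the
// real work across processes/servers; none of these internals are
// its actual code.
public class ToyBus
{
    private readonly ConcurrentDictionary<string, List<Action<object>>> subscribers =
        new ConcurrentDictionary<string, List<Action<object>>>();

    public void Subscribe(string messageType, Action<object> handler)
    {
        this.subscribers.GetOrAdd(messageType, _ => new List<Action<object>>()).Add(handler);
    }

    public void Publish(string messageType, object message)
    {
        List<Action<object>> handlers;
        if (this.subscribers.TryGetValue(messageType, out handlers))
        {
            foreach (var handler in handlers)
            {
                handler(message);
            }
        }
    }
}
```

With this toy, publishing "RatingChanged" triggers a recalc handler, which publishes "AverageRatingChanged", which triggers the cache-update handler; the same chain the Global.asax wiring above sets up, but spread across processes.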


There are tons of caching solutions out there, a lot of them working in memory. In these days of AppFabric and alternatives, a distributed cache should be strongly considered. But, if you just need a stupid little in-memory cache, and you need multiple indexes, then this works easily and very well. By swapping out the store, I could use multiple indexes against any of the caching APIs including AppFabric.

And, more importantly, it was fun. I’ve been tasked with dealing with some caching issues in another capacity and have often wondered “what-if?” and “why-not…”, and this was a good opportunity to exercise those questions in 120 lines of code.

Caching / Simple Bus

March 5, 2011

BluRayFriend development continues at a snail’s pace. It’s really hard to get excited about it because it’s a website, and I hate working on websites. I’m excited about the functionality I hope to offer, but not about all of the HTML it takes to get it going.

A couple weeks ago, I created some cache objects. They were just wrappers for System.Runtime.Caching; they put a little structure around it so that I can simply write some subclasses and be good to go.

Then I ran into a need for a multi-index cache. I need to be able to retrieve users by their BluRayFriend user id, and also by their Facebook id. It’s all the same users, so I didn’t want two caches for it.

To address that, I wrote about 120 lines of code that resulted in a thread-safe multi-index cache. It uses ConcurrentDictionary in the background, but I kept the keys as strings (which I mentally debated) so that it can be used with other things, such as System.Runtime.Caching, that require string keys. That object is pretty neat; I’m really happy with how it turned out.

A couple nights ago I forced myself to sit down and try to get something done on the site. What I went with was the completion of the star-rating system (Netflix style). I found some code a few weeks ago to get the stars to work well, and I created the AJAX stuff to save the values. The remaining thing to do was to display the AVERAGE RATING using the star control. I had it hard-coded to 4.

I got that going using the Rating cache (which is a subclass of the earlier single-index cache). When you update your rating for a movie, it needs to

  1. save the rating
  2. update the average rating in the db
  3. update the average rating in the cache (or eliminate the item from the cache so that it reloads on next hit. the former is obviously more efficient).

The link that I will eventually paste into this post contains more information, but in short, I didn’t want to just use a trigger or any other mechanism that may result in an onslaught if someday the site has hundreds of users. I wanted to offload the recalculation and the update. That part’s easy; we can do that any number of ways. The trickier part is updating the cache. If the recalc is done out-of-process, then how do you let the website know to flush the cache? What if there are many websites that need to be flushed? How do we solve this?

There are probably several ways to solve it (like a distributed cache such as AppFabric, perhaps). But, for various reasons, none of those appealed to me. What I really wanted was a message bus. Should I find one or write one?

I’ve looked at NServiceBus in the past, and did some write-ups about it at work. It would easily do what I want to do, but not do it the way I’d want to do it. As great and successful as it is, it lacks middleware that would allow me to drop things onto the pipeline for message manipulation, etc. Every client is aware of all of the other clients and enqueues messages to all the proper queues. I would prefer to enqueue the message once and have something else work it out.

As I have mentioned in a previous post, late last year I spent a good deal of time working on AwBus, which is an incomplete ESB. It incompletely does all of the things that I would like to be able to do on an ESB. But it got too big and I ran out of steam; I had other projects to do. One of its next major features would’ve been an MSMQ-based (among others) pub/sub. So, I’ve been thinking about it for a long time. Before looking at NServiceBus and NeuronESB, I never fully appreciated the power of MSMQ in advanced scenarios. But, as is usually the case due to a genetic disorder, I look at those things and think they should’ve been done slightly differently, and I wonder why they weren’t done the way I think they should be. One way to find out is to try it.

So, here I am working on this stupid movie site that’s taking 3 months longer than it should’ve, and I realized that I wanted live-cache update, and I want to do it via a message bus that can solve not only the cache problem but all other types of problems, and suddenly I no longer dreaded working on the site. Oh no. Opportunity has knocked. Time to write a message bus.

So began SimpleBus: http://someguysoftware.wikidot.com/simple-bus

“AllardWorks.Com Inc” officially dissolved last year. I’m still using the email; I had it for a decade before I ever incorporated the name. But what does AllardWorks mean, really? It has my last name in it, so it implies either that I work or that the stuff I do works. Ok. Recently I started thinking about rebranding. I thought of a conversation at work when someone asked me “Who wrote that?” regarding a software package. I responded “Some Guy”. That’s all that was needed to express that it wasn’t a major software package released by some conglomerate. It’s just something someone put together on their own time, or as part of a really small company that may or may not last to the weekend. I liked how that sounded, even though the impression it leaves isn’t overwhelmingly positive when seeking a new product. So, I bought “SomeGuySoftware.com”. BluRayFriend and SimpleBus are the first two projects-that-will-probably-fail under the new branding. The site will go live sooner or later, and it will absolutely be running using SimpleBus. What isn’t as clear is if I’ll end up exposing SimpleBus to any interested parties. I’d like to, but these projects tend to get out of control and I run out of steam. But, at least it works and is live and in use.

That’s it for my rambling. Some of what I just spewed is also on the simple-bus blog, but it has more details about the bus itself.