Application Server – Pub/Sub Working

April 7, 2013

I haven’t really worked on my latest personal project in a few weeks. The application server is up and running: it’s completely command-driven, and the commands can be specified declaratively.

I started to implement the new Semantic Logging Application Block from P&P, and created a custom sink that sent the log items to MSMQ. That worked, but I want to be able to process the messages multiple times: put them in a database, and also feed monitoring tools and the website. So, I needed pub/sub.

Since that realization, I haven’t worked on it too much. I’ve spent a few minutes here and there, but tonight was the first time I put some serious time into it while watching “MASK OF ZORRO”.

The plan is to have two implementations out of the box:

– MSMQ – the best choice, when available

– NetTcp – a less reliable backup for when MSMQ isn’t available.

The nice thing is that both implementations can use the same contracts; the only thing that changes is the bindings. Yay for WCF! So far, though, I have been focusing on MSMQ.
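As a sketch of what that looks like, here’s one way a single contract could be hosted over either transport. The contract shape, queue/endpoint addresses, and binding settings below are illustrative assumptions, not the project’s actual code:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Channels;

// Illustrative contract; MSMQ endpoints require one-way operations,
// which happens to suit fire-and-forget publishing anyway.
[ServiceContract]
public interface IServerPublishService
{
    [OperationContract(IsOneWay = true)]
    void Publish(string topic, string payload);
}

public class PublishService : IServerPublishService
{
    public void Publish(string topic, string payload)
    {
        Console.WriteLine("{0}: {1}", topic, payload);
    }
}

class HostSketch
{
    static void Main()
    {
        bool useMsmq = true; // fall back to NetTcp when MSMQ is unavailable

        // Same contract and service class; only binding and address differ.
        Binding binding = useMsmq
            ? (Binding)new NetMsmqBinding(NetMsmqSecurityMode.None)
            : new NetTcpBinding();
        string address = useMsmq
            ? "net.msmq://localhost/private/publish"
            : "net.tcp://localhost:8000/publish";

        var host = new ServiceHost(typeof(PublishService));
        host.AddServiceEndpoint(typeof(IServerPublishService), binding, address);
        host.Open();
    }
}
```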

As of tonight, it’s working, but there’s still a lot more to do. The client is pretty configurable, but the server isn’t yet; there are a few hard-coded things in there that have to go. All of this can be driven by the WCF configuration files, and that will be an option, but I want it to just work without people having to mess with anything. I have some work to do on that end.

There are two WCF services running on the app server: one to handle subscribes, and one to handle publishes. A lot of the work I did previously paid off nicely here. I created my WCF contracts and an MsmqPubSubFeature. I put two attributes on the feature, one per service, and now the services automatically start… just as they were supposed to. (It defaults to NetTcp, so I added WCF configuration to the app.config to switch it to MSMQ.)

The publish service:

public class PublishService : IServerPublishService

The subscribe service:

public class SubscribeService : IServerSubscribeService

The feature class:

public class MsmqPubSubFeature : Feature, IPubSubFeature

Each service has a name. The feature has a name. When the feature starts, it starts the two WCF services by name.
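A hypothetical sketch of how the attribute-driven startup might hang together; the ServiceAttribute and Feature types below are stand-ins for the project’s internals, not its real code:

```csharp
using System;

// Hypothetical stand-ins for the project's feature infrastructure.
[AttributeUsage(AttributeTargets.Class, AllowMultiple = true)]
public class ServiceAttribute : Attribute
{
    public string Name { get; private set; }
    public ServiceAttribute(string name) { Name = name; }
}

public abstract class Feature
{
    public abstract void Start();
}

// One attribute per service; the feature starts each by name.
[Service("PublishService")]
[Service("SubscribeService")]
public class MsmqPubSubFeature : Feature
{
    public override void Start()
    {
        foreach (ServiceAttribute attr in
                 GetType().GetCustomAttributes(typeof(ServiceAttribute), false))
        {
            Console.WriteLine("Starting service: " + attr.Name);
            // The real hosting layer would resolve the named service
            // and open its ServiceHost here.
        }
    }
}
```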

I set up a simple test application that uses the pub/sub client (a stand-alone assembly) to exercise it, and it’s working as expected.
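For flavor, usage from a test app might look something like this; PubSubClient and its members are hypothetical names, not the real API:

```csharp
using System;

// Hypothetical sketch of the stand-alone client assembly's surface.
public class PubSubClient : IDisposable
{
    private readonly string address;
    public PubSubClient(string address) { this.address = address; }

    public void Subscribe(string topic, Action<string> handler)
    {
        // Real implementation: call the subscribe service over WCF
        // and wire the handler to incoming messages.
    }

    public void Publish(string topic, string message)
    {
        // Real implementation: call the publish service over WCF.
    }

    public void Dispose() { /* close channels */ }
}

class TestApp
{
    static void Main()
    {
        using (var client = new PubSubClient("net.msmq://localhost/private/pubsub"))
        {
            client.Subscribe("logs", msg => Console.WriteLine(msg));
            client.Publish("logs", "hello from the test app");
        }
    }
}
```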

Progress made.




Something I wrote in 1996

April 3, 2013

I’ve been trying to clean up and organize some old files and archives. I wrote this in March 1996 when DVD was being introduced.

As I started reading it, I immediately groaned. These days, I would never presume anything to be “entertaining” or “fine”; I’d just write it and let readers judge for themselves. Oh well.

From the Dawn of Time,

To D.V.D…

The Evolution of Mass Storage

An Entertaining Documentary written by the  fine people of

 Allard-Works Enterprises.

Note: This is written to be somewhat entertaining, but the technical content is accurate.

D.V.D. is an acronym of varying meaning.  To some, it means Digital Versatile Disk.  To others it means Digital Video Disk.  And to a few, it sounds like a brand of quality underwear.  Whether the V actually stands for Video or Versatile depends on where you read it.  However, because D.V.D. does so much more than just video, Versatile is much more appropriate.  As for the underwear people… don’t worry about them.

D.V.D. is the latest entry in the evolution of mass storage.  As are many things when they are first created, it is a phenomenal technological breakthrough that will soon have a significant impact on the multi-media world.  But before you, the reader, can fully appreciate D.V.D. and all its splendors, a quick refresher course of computer history is in order.

In the beginning, there was nothing.  The universe was an infinite void, much like the interior of Rush Limbaugh’s head.  And then there was a magnificent explosion of matter, tossing planets and galaxies every which way, filling the infinite void with a great number of floating rocks.  After a few million years, one of those rocks metamorphosed into a prettier rock, which was shortly thereafter named the planet Earth.  The Earth’s molten-lava surface cooled down allowing forests to rise up from the ashes, and oxygen and hydrogen combined to create a substance that we now use to wash our cars.  Though there was no one around to realize it, it was in those early years of the Earth’s childhood that the first form of mass-storage was created.  Those same forms of mass-storage are still around today.  We call them lakes and oceans.  When you think about it,  we have to keep the water somewhere!

For a long time, the Earth revolved around Ra, not really doing much other than chatting with the moon.  Meanwhile, on the surface, activity was afoot.  After a few failed attempts at creating significant life, a few amino acids and proteins finally got together, had a party, conjugated, and created the original Jurassic Park.  Dinosaurs ruled the planet for more than 6 months, and then another one of the universe’s rocks killed them.

After that, computer history varies depending on your religion (or lack thereof), but, one way or another, homo-sapiens began their reign.  Humans all across the world soon began to communicate with one another, form societies, discuss problems, and kill each other.  After many years of this, an inspirational man with a powerful vision came along to teach his brothers and sisters of the world.  This man believed himself to be the son of God, the almighty messiah, and many worship him to this day.   Over the years he has been called many things by many people… but most of us just call him Bill Gates.

Humans, it is believed, first created computers on this planet.  Now, let’s explore why.

Data, otherwise referred to as information, is the basis of nearly every activity in existence, ranging all the way from going to the bathroom, to controlling a space shuttle.  At about the same time humans overcame their fascination with fire, they began to realize the significance of data, and the importance of storing it for future reference.  This led to the creation of writing, which is, essentially, data storage.  People have been writing for at least 2000 years, and data has been accumulating for the entire duration.  Humans are fascinating creatures in the aspect of their overall desire to learn, creating more data based on old data, and constantly progressing.  Industries were born and flourished, and the more data we created, the more we needed, and wanted.

This whirlwind of data eventually led to the discovery of electricity, Pop Tarts, and atomic structure, which later led to the computer.  A computer, in simple terms, is a device that stores, manipulates, and utilizes data to produce different data that the user desires.  The difference between the computer and humans is the speed in which they do this.  Also, humans can’t play Doom with just a pad of paper.

To break it down even further, computers work with electrons, which as most Physics majors will attest, are very very small.  Each of the billions of instructions the computer processes breaks down into a simple term: YES or NO.  YES is represented by the presence of a few electrons in a specific area, and NO is represented by the lack of electrons.  Because electrons are so small, millions of these yes and no answers can be stored in a very small area.  As computer technology progresses, the number of electrons necessary to indicate a YES decreases.  Someday, in the near future, only one electron will be needed to perform this task.

Each of these YES and NO results is called a bit.  Eight of these bits form a byte, which is usually the smallest piece of data the average user uses.  For example, each character you read on this paper is represented by one byte, which consists of 8 yes and no’s.

The problem with computers is that they require a constant feed of electricity to maintain the integrity of their bytes.  Thus, when a computer loses power, it loses its memory.  Early on, someone realized that this would be very bad in most cases, and technology developed ways to store the computer’s data.  As far as the personal computer is concerned, this led to the invention of the floppy disk in the early 80’s.  The first consumer-friendly floppy disk could hold 360k (360,000) bytes of data on a 5.25 inch disk.  That is a massive amount of data.  Time progressed, and being as greedy as we are, we wanted more storage space.  In August 1984, a company called IBM introduced a 1.2M drive.  (1 million, 2 hundred thousand, pronounced “1.2 meg drive”.)  This disk stored four times as much as its predecessor in the same amount of physical area.  A fourfold improvement.

At the time, 1.2M was incredible, and it is likely people never thought that they would need any more.

In 1986, the next floppy disk was introduced: the 720k.  720k, representing 720 thousand bytes, may seem like a downgrade from previous successes.  However, the significant thing about the 720k is the fact that it is a much smaller, much more durable disk.  In relation, twice as much data as was held on the original 360k floppy could then be stored on a smaller disk.  In 1987, the 720k became the 1.44M drive.  Nine years later, the 1.44 meg drive is still standard.

As floppy drives developed, so did another handy-dandy mechanism called the hard drive, or, fixed disk.  A floppy is just a disk that you can insert, use, and remove.  A hard disk, on the other hand, remains inside your computer.  It is much faster and holds a lot more information than a mere floppy.  The earliest hard drives could hold 10 megs of data… that’s 10 million bytes, or 80 million bits.  Again, everyone thought this was fantastic.  In the July 1989 Computer Shopper, 10 megs was considered mass-storage.  However, as the storage capacities increased, so did the amount of space needed for computer software.  Thus, between 1989 and 1996, hard drive sizes have risen from 10 megs, to in excess of 1,600 megs (1.6 Gigs), and are continuously growing in size while depreciating in cost.

Currently, CD-ROMs (Read-Only Memory) are extremely popular.  The great thing about CD-ROMs is that they hold 650 megs of data on a 4.72 inch diameter disk which is only 1.2mm thick.  (Incidentally, it is reported that this size was decided upon because it is precisely large enough to hold Beethoven’s Ninth Symphony.)  Six hundred and fifty megs!!!  Now we’re playing with power.  This kind of mass storage on such a small, lightweight, portable device has opened up an entire new world in the realms of computer graphics and audio.  In the days of floppies, one would have to insert many disks into the drive during a long and tedious installation process.  With CD-ROMs, you just insert one disk and let it rip.

Let’s see how a CD works.  CDs store data by using PITS.  A pit is a microscopic depression in the disk.  Where there’s a pit, there’s a YES.  Where there isn’t, that’s a NO.  These pits are 0.12 microns deep and 0.6 microns wide… about the size of Rush Limbaugh’s brain.  A laser system scans the disk in specified areas to see whether a pit resides there or not, and returns the result to the computer system.

How much information, exactly, can a CD hold?  Well, I already mentioned that it’s 650 megs of data.  To give you an idea in practical terms, one CD can hold 74 minutes of high-fidelity audio, or 333,000 pages of text, or a combination of the two.

Some of us have been exposed to “LASER DISKS,” which are very large CDs that you can also use as a bathroom mirror.  Laser disks work on the same principles as CDs, except their primary purpose is video.  Over the last few years, more and more companies have been putting actual video onto the CD, incorporating it into presentations and games.  The term “interactive movie” is becoming increasingly popular in the “intertainment” world.  Actors and actresses are being hired in place of the artists who used to have to conjure up fictional characters for their games.

SIDEBAR:  This doesn’t really fit into the paper anywhere, but think about this:  Years ago, the computer industry and the movie industry were at opposite ends of the spectrum.  Movie companies would hire actors and actresses to participate in a story.  Software companies would hire authors to create dazzling graphics and game characters.  Today, the two are meeting in the middle of the spectrum, and starting to pass each other, each advancing towards where the other began.  As I said, interactive movies are becoming more and more popular.  Two recent MAJOR games, Wing Commander III and Wing Commander IV, hired a slew of big-time actors to bring a video game to life.  Malcolm McDowell, Mark Hamill, and Ginger Allen among them.  Meanwhile, “CG” (computer graphics) in the movie industry is beginning to replace actors and actresses.  In the movie Mortal Kombat, for example, there was a fight scene between two characters where actors weren’t used.  It was all CG.  Similarly, in CASPER, most of the characters were computer concoctions, along with some key scenes from Jurassic Park.  The mother of all CG movies so far has been TOY STORY.  100% computer graphics, and it was a huge success, inspiring such things as the upcoming Disney movie “JAMES AND THE GIANT PEACH”, which uses a great deal of CG.  Recently, the movie JUMANJI dazzled audiences with its own computer-generated jungle wildlife.  Hundreds of thousands of lines of computer code were written for the lion alone.

This leads me to the final point of the sidebar.  So far, CG has created dinosaurs, monkeys, cartoon characters, lions, and a distant shot of humans.  How much longer do you think it will take until someone creates a lifelike CG human?  There is a man in L.A. (name lost) currently creating a humongous program which is, essentially, a CG human.  He is accurately reproducing all muscle groups of the human body, and how they interact with each other.  When this is done, he will have a programmable human that will, on the screen, do anything he tells it to, and the body will react just as a human would react.  As of January 1996, the head and upper right part of the body were done, with only one problem: the shoulder.  Apparently (if I remember correctly), much like gravity, no one understands how the shoulder muscle group functions.

Envision a day 10 years from now when movie companies won’t have to pay 20 million dollars for a big-name star.  All they’ll have to do is define the parameters of the body, then tell it what to do.  What will happen to all the big-time actors then?  They’ll probably be in video games.


For years, people have been trying to cram as much video as possible onto a CD-ROM.  Unfortunately, it hasn’t been very successful.  A CD can hold about twelve seconds (mostly a guess, I lost the actual figure) of uncompressed video and audio.  Since most movies are in excess of 90 minutes, something had to be done.  That something turned out to be MPEG (Moving Picture Experts Group).  MPEG can compress at a ratio of 200 to 1 while maintaining high quality.

How does compression work?  Good question.  Just a brief answer: compression uses a few bytes to represent many bytes.  This is a loose example that I am making up on the spot, but let’s pretend that you have a document with the word THE in it 50 times.  That’s 150 characters, or 150 bytes.  Using compression, every THE might be replaced with ~, bringing the bytes needed down to 50 characters instead of 150.  Then, every time the software sees ~, it replaces it with THE.  This is an extremely uncomplicated example, but the principle is accurate.

MPEG is a much more complicated example of compression.  If you’re watching a movie and someone is talking, what is changing about the video?  Not too much… just the speaker’s mouth, and maybe a little head movement.  MPEG pays attention to these changes.  Instead of storing an entire picture frame, it only stores the part that has changed from one frame to another.  So, the less movement in a film, the more tightly it can be compressed.  If you’re watching a basketball game where everything is panning and moving, then the compression isn’t as good.  High-end home computers can do MPEG compression.  However, the second generation of MPEG is in full swing.  MPEG 2 produces even tighter compression, and it requires hardware to do the compressing.  MPEG 3 is soon coming.

Back to CDs.  Even with all of this major compression, there still isn’t nearly enough room on a CD to fit an entire movie.  A lot of companies were considering breaking it down into two CDs, but it never really happened.  Recently, however, music groups such as THE CRANBERRIES are incorporating videos on their CDs in addition to the audio tracks.

The original concept for CDs was concocted way back in 1983.  Thirteen years later, it is an extremely crucial part of most computer systems.  Thirteen years old… that’s ancient in the computer field.  It would stand to reason, as history dictates, that it’s time for a change.  That change is upon us, as you read in the opening paragraphs of this essay.  That change is D.V.D.: Digital Versatile Disk, and it’s coming soon.

D.V.D. is the next step in the evolution, and it is a direct descendant of the CD.  Over the last few years, many companies have had different ideas on how to increase the storage space of the CD.  One idea was a two-sided CD, with two lasers, and a disk twice as thick.  Another idea was a double-layer CD with only one laser; the laser would just focus on the appropriate layer and ignore the other.  Another idea was to just make bigger CDs with more, smaller pits.

The result is a dual-layer disk of the same thickness (1.2mm), but a diameter that extended from 4.75 inches to 5 inches, and pits that have significantly decreased in size.  This, combined with the MPEG 2 video compression standard, results in a 4.7 gigabyte single-layer disk, and a 9.4 gigabyte dual-layer disk.  A single layer will provide for 133 minutes of full quality video along with three sound channels and 4 subtitle channels.  The dual-layer disks will allow for 4 1/2 hours of the same.  As for speed, the slowest DVD player will be 8x to 10x faster than the original CD player, which is the speed that audio players use.  The fastest CD-ROM player of today is only 8x.  The jump from CD to DVD is monstrous, and the beautiful part is that it’s backwards compatible.  Your low-storage 650 meg CD-ROMs will operate in the mass-storage 9.4 gigabyte DVD drive.

D.V.D. will be making its debut in late 1996 or early 1997, and costs will begin at $500 to $700.  As more companies produce them, the price is sure to drop, and the speed is sure to go up.  Unfortunately, D.V.D. is still a WORM device (Write Once, Read Many), but, at last, entire movies will fit on one disk.

In 1983, a floppy drive could hold 360,000 bytes of data.  In 1996, D.V.D can hold 9,400,000,000 bytes.  Why will we ever need more space than that?  Well, as history shows, in another 13 years we will probably be dealing with terabytes.  Why?  Consider this:  The invention of CD-ROMS allowed people to put huge amounts of data onto a CD-ROM.  This sufficed for quite a while, and eventually people saw how much space they had, so the technology advanced to the point of filling the disk.  Then, occasionally, more than one disk was required.

The same thing will happen with D.V.D.  It will suffice for probably a few years.  People will put massive amounts of video into games, instructional applications, or other multimedia endeavors onto one single layer disk.  They’ll see how much space they have left, and continue to push, adding more and more.  Eventually, the dual layers will be full, and things will start coming on 2 D.V.Ds… then there will be something new.

I hope you enjoyed our little trip down memory lane, and our glance into the future.