Oracle Sequence Promotes Poorly Maintainable Code

I started my professional career working with Microsoft’s SQL Server. I spent twelve years off and on learning how to design a database and write stored procedures in T-SQL. Then this last October, I switched jobs and was exposed to a new database platform, Oracle. Since this switch, I have used every curse word I know and invented new ones to express my frustration at interacting with an Oracle 10x something database. I’ll leave the rest of my rantings for another post and just focus on one aspect of Oracle that has frustrated me recently.

I started creating my first new table in Oracle and began defining the columns. I always start with an ID column that typically serves as the primary key of the table. As I went to select the column type, I didn’t see anything labeled “autonumber”. Trying again, I looked for integer, but that isn’t there either. Oracle only offers the “Number” type, where you provide the precision (total digits) and the scale (digits after the decimal point). After selecting the Number type, I looked all over for something that would mark the column as unique and set it up as an autonumber sequence. Striking out quickly, it was time to ask Google and start learning about Sequence objects.

Oracle tables have no built-in mechanism for auto-numbering. Instead, you must create a separate Sequence object and reference it each time a record is inserted into the database.
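Creating the sequence itself looks something like this sketch (the sequence name and options are illustrative, not taken from my actual schema):

```sql
-- A sequence is a standalone database object, not part of the table.
CREATE SEQUENCE customers_seq
    START WITH 1
    INCREMENT BY 1;
```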


Then each time the sequence is used, it looks something like

INSERT INTO customers (ID, Name…) VALUES (customers_seq.nextval, 'Hogan Haake'…);

Compare this to the SQL Server behavior I’m used to: if a column is an autonumber (identity) column, you simply exclude it from the insert and it automatically gets the next ID.

INSERT INTO customers (Name, …) VALUES ('Hogan Haake'…);

At this point, Oracle people could argue that I’m just lazy, or that I just need to learn a new way. They are right on both counts, but there is more to the story! I recently came across some bad code in part of my application where the developer didn’t use sequence.nextval for an insert; instead, they converted the current date into a number [YYMMDD format] and inserted that into the table as a unique value. While that method worked, the unique numbers they were generating were quite far ahead of the current sequence. The system has been in production for two years now, and the sequence is about 6 months away from a “collision” with the manually inserted numbers in the ID column.
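The offending insert presumably looked something like this sketch (table and column names are illustrative, reconstructed from the description above):

```sql
-- Bad idea: use today's date in YYMMDD form as a "unique" ID
-- instead of customers_seq.nextval.
INSERT INTO customers (ID, Name)
VALUES (TO_NUMBER(TO_CHAR(SYSDATE, 'YYMMDD')), 'Hogan Haake');
-- On 2012-Feb-10 this produces ID 120210, far ahead of a
-- sequence that was still in the low 100,000s.
```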

Current Sequence Value    Manual Sequence Value
107,000                   120,210 (first inserted 2012-Feb-10th)

The current sequence value is fast approaching the first manual sequence value. It was fortunate that the bug was found before it caused corrupt data and long nights for me. Due to the complexity of the system and time constraints, the simple fix of incrementing the sequence’s next value to 500,000, to avoid any future collisions with the “unique” numbers, was chosen. It would be nice to fix the offending code to use the correct sequence number, but management decided the code worked well enough that we could move on to other problems.
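Since the Oracle versions I’ve seen have no single command to restart a sequence at a new value, the jump is usually done by temporarily changing the increment. A sketch, assuming the sequence is named customers_seq and currently sits near 107,000:

```sql
-- Jump the sequence ahead to 500,000 in one NEXTVAL call,
-- then restore the normal increment of 1.
ALTER SEQUENCE customers_seq INCREMENT BY 393000;
SELECT customers_seq.nextval FROM dual;   -- consumes the big jump
ALTER SEQUENCE customers_seq INCREMENT BY 1;
```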

In a SQL Server environment, if you try to insert an explicit value into an autonumber (identity) column, an error is produced, preventing this type of bug from happening.
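A quick sketch of that safety net (the table and names are illustrative):

```sql
-- SQL Server rejects explicit inserts into an IDENTITY column
-- unless IDENTITY_INSERT is deliberately switched on.
CREATE TABLE customers (
    ID   INT IDENTITY(1, 1) PRIMARY KEY,
    Name VARCHAR(100)
);

INSERT INTO customers (ID, Name) VALUES (120210, 'Hogan Haake');
-- Error: "Cannot insert explicit value for identity column in table
-- 'customers' when IDENTITY_INSERT is set to OFF."
```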

I’m not sure what other issues I’m going to encounter with this new environment, but I sure miss SQL Server. If you still don’t think SQL Server is better, consider community support. Who would you rather trust for help?

Pinal Dave (Sql Server) or Don Burleson (Oracle)?


Blog Engine .NET Tag Cloud Optimization

For all of the amazing blogs and websites that are out there, hundreds exist that are just average. Most of us are not experts, like Eric Lippert, who can write with authority on a single topic for years and have interesting and new things to say. Instead, there are blogs like mine. I write about whatever is interesting to me at the time. As the years progress, what I write about changes, and I want my website [Blog Engine .NET] to reflect that.

As I create new posts, they are shown on the front page till they get old. However, I have found that the default Tag Cloud widget is only interested in what I have written about most. For example, I took a motorcycle trip with my brother, so I wrote extensively about it. Since that trip almost two years ago, Arkansas has been the top item on my Tag Cloud. I have nothing against that wonderful state, but I felt that it made my website look out of date.

I have made a simple one line code change to the Tag Cloud that makes it more relevant. Instead of using tags from all 190 posts on my site from the last 2 years, I changed it to only consider tags from posts that are less than 1 year old. After this change was made, the Tag Cloud immediately updated and more accurately reflected what I have been writing about recently.

Now for the details, I’ll assume you’re familiar with Blog Engine .NET:

1. This example uses version 2.0; I have not tested it with other versions.
2. Go to the file (from the root of the source code download) BlogEngine.NET\widgets\Tag cloud\widget.ascx.cs
3. Using your favorite text editor (because we don’t need to compile anything), go to line 197.
  a. Find the method private static SortedDictionary<string, int> CreateRawList(). The old method begins:

foreach (var tag in Post.Posts.Where(post => post.IsVisibleToPublic

With a minor modification to the Where clause, we can limit it to just the last year’s worth of posts.
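Sketched in code, the change looks like the following (this assumes the BlogEngine.NET Post class exposes a DateCreated property; check the property name in your version’s source):

```csharp
// Only consider tags from posts created within the last year.
foreach (var tag in Post.Posts.Where(post => post.IsVisibleToPublic
    && post.DateCreated > DateTime.Now.AddYears(-1))
```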

4. With the minor change made in the method, save the file and upload it to the same relative path on your Blog Engine .NET installation. The next time you request a page with the Tag Cloud on it, the site will automatically re-compile and the cloud will be up to date.

Sorry that I used images instead of text, but I wanted the color to come out nice. If you’re nervous about making the code change yourself, you can download the one file to install yourself. (1.84 kb)

Happy Coding!

Hogan Haake

Where Did Part Of My File Go?

I’ve primarily been a Windows software developer for the last 12 years. During that time, I’ve written lots of web sites, desktop applications, and server applications. I recently changed jobs and am now working mainly on an IBM mainframe and Unix. I occasionally get to do some Windows applications, but they are few and far between. Lucky for me, today was one of the days when I got to work on a Windows application. It was a simple job: moving a file from a server, doing some minor processing, and then FTPing it to a Unix server for final processing.

In my attempt to be a good developer, I spent a significant portion of my time on the application testing and placing try/catch blocks to make the application as safe as possible from any issues. As part of that safety, I decided to validate that each file I FTPed to the Unix server did indeed get there. After I finished the FTP, I did a directory listing, “ls -l”, the equivalent of “dir” in DOS. The listing came back with all the files in the directory and their sizes.

I wrote a loop to compare the file sizes between the Windows files on the local machine and the Unix files transferred. To my surprise, none of the files I transferred were the same size! I was perplexed by this. After downloading the plain text .csv file from the Unix server to a different folder, I checked the properties, and it was indeed smaller than the original file uploaded. I tried to open it figuring that it would fail, but to my surprise, it opened correctly. I looked at a sampling of the lines in the file and they had the same data.

I was quite frustrated at this and Googled around looking for answers. There were many posts about encodings of files being different. Research in this realm brought me no closer to the answer. The files both appeared to be encoded as ANSI. I decided to resort to the lowest level of debugging I could think of. I downloaded a hex editor to look inside the files and see what they had for data. There in the hex, comparing file next to file, I found the issue that had been eluding me for a while.

Windows terminates its lines with CHR(13) CHR(10) [Carriage Return, Line Feed]
Unix terminates its lines with CHR(10) only [Line Feed].

I was losing one byte per new line in the file I uploaded. It seems that the FTP transfer converted my file on upload. When downloaded and tested on my Windows machine, the smaller file could still be processed: Microsoft Excel opened the .csv file just fine. This explains why the file size was smaller but all of the data was still accessible.

Windows hex output. Note the highlighted square “0D”. That is hex 0x0D, chr(13), the carriage return. It is followed by 0x0A, chr(10), the line feed.

Note the Unix file below in the same position only has the 0x0A chr(10) for newline, but the carriage return has been stripped out. This accounts for the file size difference.

In order to continue my quest for fewer errors and validation of my data, I wrote a method, which I’ll share with you, to get the Unix file size.

// Requires: using System.IO;

/// <summary>
/// Get the size of a file as it will appear on a Unix system. This entails
/// counting all of the CHR(13) characters and subtracting that count from the
/// overall size, as Unix doesn't use them in a newline. Stupid Unix!
/// </summary>
/// <returns>The file size in bytes, less one byte per carriage return.</returns>
private static long GetUnixFileSize(string fileName)
{
    long windowsFileSize = new FileInfo(fileName).Length;
    long unixFileSize = windowsFileSize;

    using (StreamReader sr = new StreamReader(fileName, true))
    {
        char[] c = new char[1];
        const char char13 = (char)13;
        while (sr.Read(c, 0, 1) == 1)
        {
            // If we find a Carriage Return, decrement the count.
            if (c[0] == char13)
            {
                unixFileSize--;
            }
        } // while (sr.Read(c, 0, 1) == 1)
    } // using (StreamReader sr = new StreamReader(fileName, true))

    return unixFileSize;
}

Hope this saves some of you time!


Getting rid of SPAM comments in BlogEngine.NET

I have been running this website on BlogEngine.NET for over a year now. I like the interface and the familiar .NET coding environment. One of the biggest drawbacks to this blog software is the proliferation of SPAM comments that are entered on my site. I began to take for granted that I would never get rid of them. Every few days, I got notified that there was a new comment on a post I made. Nearly every comment was SPAM. I decided it was time to fight back. I didn’t want to lose control of the comments to a third party, or invent/borrow my own captcha system. Some day that may be necessary, but I don’t particularly like them. When I looked at my web logs for the site, I noticed that there were several searches for “powered by” and then some random term.

And they found my page because the footer of every page contained the standard “Powered by BlogEngine.NET” text.

This got me thinking. I’m using the standard theme (Indigo) provided during the installation of BlogEngine.NET, so I decided to try modifying the master page for this theme. Once I modified the theme, I stopped getting SPAM comments on my site. If you’re interested, you should do the same.

1. Download the file /themes/Indigo/site.master
2. Change the text on line 64 FROM

<div class="footer">
        Powered by <a href="" target="_blank">BlogEngine.NET</a> <%=BlogSettings.Instance.Version() %> |
        Original Design by <a href="">Arcsin</a>, Adapted by <a href="">RazorAnt</a>
</div>

To something like

<div class="footer">
    Thanks for visiting
</div>

Then re-upload the site.master page and your site’s footer will show the new text instead.

Give it a try, but you might not be happy with the results. Now that I changed the text, I don’t have any comments on my entries. Now to get more traffic to my site!


Business Process Re-engineering, A Business’s Quick Weight Loss

If there is one thing that this world is not short of, it’s quick weight loss plans. We are flooded with pictures and video of hard-bodied individuals in before-and-after scenarios. If you can just follow this one plan through, you too can lose 50 pounds and 19 inches across your whole body. (Disclaimer: Results not typical (or possible).) Even with the disclaimers, these programs sell to thousands of people. Yet the world is getting fatter. In 2011, the “skinniest state” rose .7% in overall obesity. [1] In 2011, only two states had declining obesity rates! [1] Weight loss and business have one thing in common: they are both products of their inputs versus their outputs. For weight loss, it’s calories; for business, it’s currency.

If you’re wondering what all this has to do with business, hang on while I finish my snack and diet soda.

Looking first to the world’s most trusted and accurate source of information, I checked Wikipedia to see what some of the business fads have been. According to its article, the following appeared, in order, from 1950–1990: [2]

    Management by objectives
    Matrix management
    Theory Z
    One-minute management
    Management by wandering around
    Total quality management
    Business process reengineering
    360-degree feedback

There are some other honorable mentions, but the list above is quite comprehensive for the business fad diet. Let’s look into one of them that is re-appearing: re-engineering, or business process reengineering.

The 1993 book Reengineering the Corporation was a smashing success, selling 1.7 million copies in less than 2 years. [3] With the fad in full swing, many companies were signing on to reengineer. Business Process Reengineering (BPR) was positioned to be the next big business thing. Hammer took what he already saw existing businesses doing and created a business movement out of it. One of the best and most attractive parts of management fads is that they often mask complacency. If organizations constantly improved and innovated each day, there would likely be no need for massive BPR projects. BPR is the perfect fix for complacent managers. Early in Hammer’s book, he states, “You don’t reengineer unless you have to”. [3 p. 12]

No diet plan is successful without a spokesperson convincing you that you’re fat and need to do something about it. Hammer came out early and often in 1993 and 1994 with interviews stating provocative things like “It’s basically taking an axe and a machine gun to your existing organization.” [4] Many other violent, distinctly un-businesslike phrases got people thinking about the idea.

Most binge dieters crash and go back to their old ways, and many companies do the same. Paul Strassman pointed this out in a scathing 1994 article about BPR: “Hammer’s simple methods, much swifter than BPI’s more deliberate approach, are preferred by the impatient and those not compelled to cope with the long-term consequences of what happens to the quality and the dedication of the work force.” [4] Strassman fears that BPR will alienate the part of the workforce that doesn’t get cut as a result of the reengineering. He likens it to a democracy turned dictatorship. Ripping teams apart has this effect.

One of the tenets of BPR is assigning employees to a BPR team. Those employees are tasked for several months to a few years with completely re-working a process and making it significantly better. Hammer makes the point in his book that members of the BPR team should not expect to go back to their original departments after the BPR process is completed. [3 p. 62] Any manager who knows the employee they are giving up is not coming back to their team is very unlikely to give up their best employee(s). The manager would be better served giving up lesser employees, as they need valuable people to get the current work done. Holding back the best employees does not help a BPR effort, yet it’s expected that managers will willingly give them up. In this instance, incremental improvement would seem a better choice for keeping valuable employees and getting more work done.

If companies don’t want to give up their best employees for reengineering, then they will have to go to consultants to get the work done. Hammer points out that general consulting firms often don’t put their best and brightest employees on reengineering assignments. [3 p. 71] This leaves companies with lesser consultants who still cost top dollar. If the process does succeed, there is a risk that the new process is understood only by the consultants and not by the organization’s core employees.

Finally, like all diet plans, BPR comes with a disclaimer. Hammer addresses the top 10 mistakes that are made during BPR. In the chapter about mistakes, he creates a disclaimer stating that it can only fail if you don’t do it right: “The results depend entirely on the quality, intensity, and intelligence of the effort”. [3 p. 14-15] So when my BPR project doesn’t live up to what it should have been, it’s my fault. More importantly, why did I let my business get this bad in the first place? Business, like dieting, is a daily affair. If you don’t pay attention, you’ll have picked up 10 pounds around the holidays and have to deal with the results!

[1] –
[2] –
[3] – The Reengineering Revolution, Michael Hammer and Steven A. Stanton
[4] –