With the advancement of open source applications into closed source platforms, the two categories of development models are becoming more and more intertwined. The weight of each model is now so evenly distributed that it is hard to avoid merging the two approaches in the modern enterprise.

One of the most popular pieces of open source software is the MySQL database. It has continuously been chosen by both small and large corporations for deployment in internal as well as distributed applications.

On the other side of the spectrum, we have Microsoft Office, Microsoft's premier office suite, which continues to produce the most dominant electronic formats for documents, spreadsheets and presentations.

In this article, I'm going to demonstrate how we can use Microsoft Excel to build a simple report from data stored in a MySQL database.

Requirements

Requirements for this article are as follows:

  • A MySQL installation configured to allow TCP/IP access. This installation can be on any platform supported by MySQL.
  • A Windows XP “client” that has Microsoft Excel and MyODBC installed.

Preparations for MySQL

In this article, Excel will be used to access data stored on a MySQL database hosted by a remote machine. This is why the MySQL installation must be configured to allow TCP/IP access.

For the purpose of this article, it’s presumed that the MySQL database is installed on a Linux host called rdsblinux.

The table that will be used in this example is called employees, and it will reside in a database called test (created by default in a standard MySQL installation). The SQL dump for this table can be downloaded here. Basically, it contains the names, extension numbers and departments of the people working at a purely fictional company.
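In case the dump isn't handy, here's a minimal sketch of what the table could look like. The column names come from the query we'll build later; the types and sample row are my own assumptions, so the actual dump may differ:

CREATE TABLE employees (
  name VARCHAR(64),
  department VARCHAR(64),
  extension VARCHAR(8)
);

INSERT INTO employees (name, department, extension)
  VALUES ('Jane Doe', 'Sales', '1234');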

To access this data, we will be using a MySQL user with the following details:

  • Username: reporter
  • Password: reporter
  • Host: any (%)
  • Privileges: SELECT on test database

It is recommended that reporter be granted no privileges beyond those outlined above, because this user will only ever read data, never write it.
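On MySQL versions of that era, an account like this can be set up with a single statement, run as a user with GRANT privileges:

GRANT SELECT ON test.* TO 'reporter'@'%' IDENTIFIED BY 'reporter';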

Preparations for Client PC

We need to add an ODBC data source for our MySQL database. Here are the steps to create one:

  1. Click Start, point to Settings, and then click Control Panel.
  2. Double-click Administrative Tools, and then double-click Data Sources (ODBC).
  3. Under the User DSN tab, click Add.
  4. In the Create New Data Source window, select MySQL ODBC 3.51 Driver and click the Finish button.
  5. In the Connector/ODBC window that appears, enter the connection details mentioned earlier: a Data Source Name of excel example, rdsblinux as the server, reporter as both the username and password, and test as the database.
  6. Click on the Test button to check that the settings are correct.
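As an aside, if you would rather not create a DSN at all, most ODBC-aware applications also accept a DSN-less connection string along these lines (a sketch using the same details as above):

DRIVER={MySQL ODBC 3.51 Driver};SERVER=rdsblinux;DATABASE=test;UID=reporter;PWD=reporter;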

Accessing the Data from Excel

Start Microsoft Excel and create a new document. Go to the Data menu, select Import External Data and click on New Database Query (as shown on the image below).

In the Choose Data Source window, select excel example and click OK.

In the Query Wizard window that appears, expand the employees tree and select the name, department and extension columns to be included in the query (refer to the following screenshot), and click Next.

Click Next through the Filter Data and Sort Order screens that follow. At the Finish screen, click the Finish button. When asked where to place the data, select cells A1 to C1 (as illustrated below) and click OK.
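Behind the scenes, the Query Wizard is simply building an SQL statement roughly equivalent to this:

SELECT name, department, extension FROM employees;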

Taraa! Your spreadsheet should now contain the name, department and extension number of every member of staff in the company.

Real World Implementations

Although this article is admittedly simplistic in nature, the real world applications of using the MySQL ODBC connector with Excel are virtually endless. Some examples that I can think of (and/or have done before) include:

  • Generating sales reports and graphs
  • Automatically creating mailing lists from a membership web site
  • Designing an interactive security report generator using conditional formatting to highlight critical areas

If you were inspired to create your own MySQL driven spreadsheets from reading this article, I would love to hear what you did and how it has benefited you. You can do so by commenting under this article.

Coding Like a Girl

No, that’s not an insult. What I’m actually referring to is this post from Creating Passionate Users. I think that the captions for the first two pictures were accidentally switched, but don’t take my word on it. I’m not the artsy type 😛

Anyway, I’m a professional programmer (my official title is Analyst Programmer, but never mind the technicalities); however, I don’t think I fit the stereotype of your average professional coder. I do comment my code, albeit sparsely. Of course, when it comes to web pages or web apps, I go to great pains to ensure that the output is 100% standards compliant.

Most of the time, I do strive to keep my code as cleanly formatted as possible. Always think of the guy/gal who’s going to inherit your work (if you think you’re going to do this job forever, I pity your lack of ambition). To me, it’s not a question of beauty or grace. On the contrary, it has more to do with science than art.

You see, to me beautiful code is functional code with as little cruft as possible. A fairly competent programmer should have at least a basic understanding of what a particular class or procedure or call does by simply glancing over the lines it contains. You’ll know where to place your comments if you plan your code in this manner. If you had a hard time coding a particular section, you know it’s time to drop in a comment or two.

Now back to the title of the linked post: code like a girl. Is this even true? At the risk of being labelled a chauvinist, I have to say that of the few lady coders I’ve known, every single one wrote messy code. That probably explains why female coders are very rare, especially in Malaysia and Singapore. However, being the smarter sex, they instantly found niches in consultancy and sales. This is where guys suck big time. They can probably construct solid, elegant products, but most guys, especially on the technology front, are extremely poor at convincing potential customers why their product is a must have.

This is where I’m in total disagreement with the author. Beautiful code is almost always not “girl code”. I know that the author uses this phrase as a figure of speech. However, it definitely does not reflect reality… at least not mine.

Conference Call Services

In this day and age, when people are becoming ever more mobile, the traditional meeting is a near impossibility, which is why I feel that conference calls are a godsend. Given the right equipment and proper arrangement, a conference call can even be a better experience than a traditional meeting.

When arranging conference calls, there are a few key elements you need to consider:

  • Location
  • Quality of Service
  • Number of parties involved
  • Cost of implementation

Arranging a conference call requires meticulous planning, not unlike organising a traditional meeting. Each party should already have the meeting minutes beforehand and be prepared to discuss key issues.

As for professional conference call services, there are a few to choose from. Companies in Malaysia could opt to use services provided by TIME dotCom or TM, to name a few. Feel free to comment on other companies you know of that provide conference call services.

The Robots Meta Tag

In my post Common Web Site Structuring Mistakes, I mentioned that robots.txt tends to betray its stated function of restricting search engine crawlers (more commonly known as robots) from certain areas of your web site. Instead of restricting, the robots.txt file might actually do more revealing. In light of this, is there any other way to get the function of robots.txt without giving away too much information? Lucky for you, there is.

Say hello to the robots meta tag… same functionality as robots.txt, but far less obvious. Here’s what one looks like:

<meta name="robots" content="noindex,nofollow">

As with all meta tags, the above goes in the <head> section of your HTML. That line informs the robot that the page carrying this meta tag should not be indexed, and that the links on the page should not be crawled. This will effectively cause compliant robots to all but ignore the existence of the page.

Yes, I did mention “compliant robots”… and the last time I checked, the robots/crawlers from major search engines such as Google, Yahoo!, MSN, AltaVista, etc. support the robots meta tag as well as robots.txt. Heck, some of the more “evil” robots don’t even follow the robots.txt rules; these robots are often used as email address harvesters by spammers.
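For comparison, here’s how robots.txt would handle a similar job. Notice that the file openly lists the very path it’s trying to keep robots out of (the path here is just an example):

User-agent: *
Disallow: /private/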

Another directive you might want to consider adding to your robots meta tag is noarchive. This tells compliant search engines not to cache the contents of the page. Thus, if your page appears in, let’s say, Google’s search results, it won’t have a Cached link under it.

So in summary:

<meta name="robots" content="noindex">
This tells the robot not to index the page.

<meta name="robots" content="nofollow">
This tells the robot not to follow links on the page.

<meta name="robots" content="noarchive">
This tells the robot not to archive the page.

<meta name="robots" content="noindex,nofollow,noarchive">
Multiple directives are possible and the options should be separated by commas.

A final caveat: just as not all robots obey the rules of robots.txt, the same applies to the robots meta tag. There’s no 100% sure way to exclude your pages from being found, other than not publishing them at all, of course.

Common Web Site Structuring Mistakes

When browsing through web sites, especially corporate web sites, there are a few things that make me go “Eurghh!”. Surprisingly though, not many of these are aesthetic issues; to me they are more structural. Here’s my list of common web site structuring mistakes, in no particular order whatsoever.

  • 100% Flash content
  • Overly Complex Dynamic URLs
  • Hardcoded Presentational Layer
  • Lack Of Contrast
  • False Standards Compliance Advertisement
  • Overly Restrictive robots.txt
  • Not Updating the Front Page
  • Use of Frames/Iframes
  • Non-descriptive Titles
  • Using Popups for Important Pages

Mistake 1: 100% Flash Content

First of all, I don’t have any deep-seated hatred for all things Flash. Wait a minute… I do! I hate flying widgets and Flash banners with a passion.

However, I’ll be the first to concede that Flash has its uses and if used properly, it’s an excellent presentational tool. The problem with Flash is that it’s often used for all the wrong reasons. Flash advertisements are irritating. I’ve yet to find someone who disagrees with this statement.

Creating an entire web site with Flash is very much possible. Hell, I have to maintain one myself, mostly due to bureaucratic reasons (I am making progress in convincing the relevant parties to convert it to XHTML though). It provides pleasing eye candy, and if constructed properly, can provide an easily navigable interface.

The major problem with constructing a web site entirely in Flash is that it is not search engine friendly. Search engines don’t “understand” Flash content, to put it simply. That is why it is important to have descriptive text content to go along with your Flash content.

The Flash content is for humans whereas the text content is for search engine spiders. If for some reason you don’t want the text content to be seen by your visitors, you can use the CSS property display: none to hide it from them. Search engines would still be able to see it though, and that is what matters the most.
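As a minimal sketch of the markup this describes (the id and placeholder text are mine):

<div id="flash-movie">
  <!-- the Flash movie is embedded here -->
</div>
<div style="display: none;">
  Brand New Gadget: a plain-text description of what the Flash movie presents.
</div>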

Mistake 2: Overly Complex Dynamic URLs

Granted, this was an old-school search engine indexing limitation, and all popular search engines now have the ability to spider dynamic URLs. However, I’ve yet to see a search result that has more than five embedded variables.

To illustrate this, let me give some examples:

  • URL A: http://www.example.com/prod.php?item=1&category=12&vendor=13&code=88&promo=xmas
  • URL B: http://www.example.com/prod.php?item=1
  • URL C: http://www.example.com/1/brand-new-gadget/

If you were to bookmark the URLs above, which one would you be most comfortable with? I’d figure that most people would go for URL C. It’s short, simple and descriptive. Another plus is that if you somehow forgot to bookmark it earlier, search engines tend to find it more easily than dynamic URLs such as URL A and URL B.

By the way, URL A has five embedded variables whereas URL B only has one. More importantly, stay away from complex dynamic URLs whenever possible. If you have some basic programming background, you might want to check out mod_rewrite if your web sites are hosted on Apache. If you’re on IIS, then I’m afraid I can’t help you much; I’ve yet to see a free URL rewriting module for it.
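As a sketch, a rule like the following in an .htaccess file would quietly map URL C back onto URL B behind the scenes (the pattern is illustrative; adapt it to your own URL scheme):

RewriteEngine On
RewriteRule ^([0-9]+)/[^/]+/?$ /prod.php?item=$1 [L]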

Mistake 3: Hardcoded Presentational Layer

This is quite a common phenomenon. My only question is, “WHY?!”. I mean, if this were 1997 then yeah, tables were a perfectly good way to place the elements of a web page. And yes, <font color="blue" size="10">this is way cool</font>.

The good news is that it’s no longer 1997, and we have nice toys like CSS so that we no longer need to mash our presentational layer in with our content. The bad news is that too many people are still trapped in the mid-90s. This is further compounded by the fact that most WYSIWYG HTML editors don’t output valid XHTML or integrate proper CSS support. Just when you thought it couldn’t get any worse, tens of thousands of “budding web designers” sharpen their “skills” on such tools, and see web standards as “getting in the way of getting ‘real’ work done”.
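To illustrate, the 1997-style snippet above could be rewritten with the presentation moved out into a stylesheet (the class name is mine):

<span class="way-cool">this is way cool</span>

And somewhere in your CSS:

.way-cool { color: blue; font-size: 200%; }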

My advice is that if you’re serious about authoring web sites, then you have to be serious about meeting W3C standards as well. In all honesty, if you’re arsed enough to actually learn something, it wouldn’t hurt to learn it properly now, would it? For instance, you could still drive a car in first gear alone… but is that the proper way to drive?

Mistake 4: Lack Of Contrast

This one literally drives me blind! I’ve seen more than enough dark gray fonts on black backgrounds, as well as light silver fonts on white backgrounds, to last me a lifetime. Believe me, there’s nothing “l33t” or “artistic” about those stupid colour schemes! There are also web sites that are way too colourful for even a kindergarten activity room.

The human eye makes sense of a visual message, be it pictures or text, by comparing the primary pattern with its surroundings. The easier it is to differentiate the “main message” from the “backdrop”, the easier it becomes for our brain to make sense of what the eyes are feeding it.

If you feel that you have a daft sense of colour, you might want to check out Steel Dolphin Color Scheme Tool. It will help you pick a colour scheme for your web site based on a primary colour of your choice.

I’d personally go for the Complementary, Split Complementary or Triadic schemes most of the time. You don’t really have to follow the generated scheme strictly; slight hue variations of it might be more to your taste. The important thing is to experiment… sanely!

Mistake 5: False Standards Compliance Advertisement

Seen any of these images recently:

  • Valid XHTML 1.0 Button
  • Valid CSS Button

You won’t believe how many pages that display the buttons above actually validate! Try clicking those images the next time you see them. Chances are, many of the web sites that display them don’t really use valid XHTML and/or CSS. To me, this is one of the biggest forms of fraud plaguing the internet, especially the Malaysian blogosphere.

I mean, if you know that your pages don’t validate, then why the heck do you still stick those links on your site? It’s like sticking an “EVO IV” sticker on your Proton. Only an idiot wouldn’t notice the difference, but to the rest of the population, you are the idiot.

So please, before you proudly display those standards compliance images on your web pages, double check to make sure they are really compliant.

Mistake 6: Overly Restrictive robots.txt

First of all, in order to understand robots.txt I’d recommend that you visit this page.

As you can see, robots.txt is a very important tool that determines how your pages are indexed by search engine spiders. The problem is that a lot of people build their web sites on downloadable CMS scripts or other site management tools. Some of these scripts bundle their own robots.txt, and more often than not the default settings are far too restrictive for most web sites.

Another funny thing about robots.txt is that, in theory, it’s supposed to “protect” certain areas of your web site from nosey search engine robots and other crawlers. More often than not, however, it gives nosey visitors hints on where to start digging on your web site for “goodies”.

Let’s take the example of PHPNuke, one of the most popular CMSs available on the web. Take a look at its robots.txt contents and you’ll notice that important areas of PHPNuke are disallowed to robots. To us humans, however, it’s a map of where they are located.
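From memory, the entries look something along these lines (illustrative, not the exact file):

User-agent: *
Disallow: /admin/
Disallow: /admin.php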

Perhaps a more interesting example would be Whitehouse.gov’s robots.txt. My oh my oh my…

Mistake 7: Not Updating the Front Page

The front page of your web site, often called the home page, is the doorway to your other pages. It should provide links to the key pages of your site, and preferably a summary of important areas that would entice a visitor to click through to them. However, there are many web sites out there that don’t make proper use of their front pages.

Imagine the following scenario; You have a web site at http://www.example.com and a “News” section at http://www.example.com/news/. In your news section you have an entry about your company’s recent acquisition of a competitor at http://www.example.com/news/2005/12/27/we-acquired-xyz-company/.

Wouldn’t you think that this is very notable news that should be announced to the whole wide world? Then why would you “hide” it under a few layers of links? Don’t you think that it should be placed prominently on your front page?

Therefore, it is of absolute importance to keep your front page updated with current news, even though you might have a dedicated page just for such purposes.

Mistake 8: Use of Frames/Iframes

This is another web technology that’s considered passé. Whatever a <frame> or an <iframe> can do, <div>s can do just about as well. If you need dynamic updating of their contents, then a little bit of AJAX can emulate that behaviour pretty well, as sketched below.
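Here’s a bare-bones sketch of that idea: a <div> whose contents are fetched and replaced on the fly. The fragment URL is hypothetical, and older Internet Explorer versions would need the ActiveXObject fallback instead of XMLHttpRequest:

<div id="content">Loading...</div>
<script type="text/javascript">
  // Fetch a fragment and place it inside the div, iframe-style
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState == 4 && xhr.status == 200) {
      document.getElementById("content").innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", "/fragment.html", true);
  xhr.send(null);
</script>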

If you need more convincing on why you should drop frames from your web pages, consider the following scenario: a page that uses two frames needs at least three HTML files:

  1. The frameholder (where you define the <frameset> element).
  2. Contents of the first frame.
  3. Contents of the second frame.

In addition to this, link targets need to be defined properly. A simple mistake could result in an external link being opened in a site menu frame. I’m sure you have encountered this problem every once in a while; it can be quite irritating to see a web page squashed into a small frame.

Furthermore, search engine spiders tend to index the contents of your <noframes> tag better than the actual frames themselves!

Mistake 9: Non-descriptive Titles

Perhaps this is one of the easier issues to fix. The contents of your <title> element are very important: they should provide a brief description of the page they’re on. Many lazy web authors template the titles of all their pages, making them identical across the entire web site.

A common misconception about the page title is that it’s unimportant because it’s merely displayed in your browser window’s title area but not on the page itself. Hence, the preference to template it rather than making it dynamic.

Some even go to the extent of making it cartoonishly dynamic by manipulating the title with JavaScript to produce irritating ASCII animations! This will annoy most users to no end, so please steer clear of doing so!

From a visitor’s point of view, the page title is virtually invisible. However, for search engines, the title tag is all important. It should concisely describe the page that’s being returned as a search query result. Here are some examples:

  • Super Duper Company Pte Ltd Web Site (across the entire site): Templating, a very bad way to use titles
  • Super Duper Company Pte Ltd. The best company which only sells super, duper stuff like super duper cars, super duper trucks, super duper whatever, all the super duper stuff you want!: Title spamming, definitely a bad way to use titles. Furthermore, search engines only index on average the first 64 characters of the title anyway.
  • Super Duper Company – Description of Current Page: This is one of the more popular title formats. Clear, descriptive, concise… However, there’s a better way to structure a title.
  • Description of Current Page – Super Duper Company: In my opinion, this is the best title structure a web site can use. The page description should appear before the company name because that is what a search result will show first. In addition, the title should focus more on the current content rather than the overall description of the web site.
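In markup, that preferred format is as simple as this (the page and company names are made up):

<title>Brand New Gadget - Super Duper Company</title>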

Mistake 10: Using Popups for Important Pages

This is yet another scientifically proven web irritation that is used for all the wrong reasons. The common reasoning offered by incompetent web designers is that popups temporarily shift the focus from the main content to the popup’s content.

What this group of designers fails to see is that after years of being plagued by popups, even the most inexperienced web surfers have developed the habit of automatically closing popups (especially unrequested ones) or enabling the popup blocking features in their browsers. Hell, most modern browsers ship with this option enabled by default.

To make matters worse, a lot of corporate web sites use popups to announce important news. Sadly, this information is unlikely to reach the targeted recipients, thanks either to the “close all popups by reflex” syndrome or to popup blocking browser features/software.

Footnotes

I hope that you picked up some useful hints and tips on designing a more structurally sound web site from what I’ve written. The purpose of this article is not to preach; hell, I’ve made almost all of the mistakes mentioned above (the only exception being mistake #1, probably because I can’t develop anything useful in Flash, let alone an entire web site).

I appreciate any comments and/or feedback. You may do so by using the comment form below, or if you prefer, you can send me private feedback via this contact form. I will strive to respond in a timely manner.