The Million Dollar Weekend
When the development team at MSN said they didn't have time to get to it, I told MSN Rankings I could get it done in a weekend. And I did. I wrote a VB application (C# didn't exist yet) to publish directly to Inktomi's engine using their custom protocol. This dramatically increased data throughput and, according to the team's monitoring software, increased referral revenue by $1 million in its first 6 months of use. Cha-Ching!
Upper management was very pleased. In fact, MSN's VP asked in a meeting the next week why I was making the dev team look bad (in this position I wasn't part of the official dev team). Unfortunately, soon after there was a re-org: the development manager who had said they didn't have time to code this solution took over leadership of my position, and I had to end my contract quickly. :s
Research Without Data?
With the MSN Research team, I was hired essentially to watch a process and make sure it kept running. The executable had been written by 2 developers over the course of a year. It transferred files very quickly, but it was a single-threaded Windows application that crashed regularly. This made it difficult to move terabytes of data from the MSN log servers to the Research team's Cosmos cluster. In fact, they were months behind.
As I normally do, I analyzed the existing situation and proposed a solution: a multithreaded, distributed-processing Windows service to schedule, report on, and process the MSN logs. I built the system in about 3 months, keeping the old system running the whole time. The new system ran on 5 servers, each receiving its jobs and steps from a central SQL job server. A web server let users view job status and request new jobs, and let administrators configure the application, restart the services, and so on.
We were caught up within a couple of weeks and stayed caught up so long as the MSN servers and the Cosmos cluster remained accessible.
Quick Start Page
What do you do when your company's page has too much data to pull? Try AJAX! I updated Keynote Systems' customer login page to pull its data asynchronously, improving perceived performance and cutting page load time by a factor of ten.
Put Your Money Where Your Mouth Is
Here's the deal. I hate B.S. If I say something can be done, it can be done. The Exchange Center of Excellence put me to the test.
They wanted to pull data from their database, expose it in a Word document, and then publish it as a secured, read-only PDF. They typically pulled the data and assembled the document by hand, then saved it as a PDF. I had a short discussion with the team about how easily this could be automated. There was some disagreement, and I mentioned I could do it in one night. My manager said to do it.
OK, so I underestimated the time required by a couple of hours. The next afternoon I delivered a fully automated SQL-to-Word-to-PDF report generator. The Doubting Thomas said nothing, but my manager was quite pleased.
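The automation boils down to three steps: query the database, fill a document template, export to PDF. Here is a minimal sketch of the query-and-fill step; the table, columns, and template text are invented for illustration, and the Word and secured-PDF export stages are only noted in comments rather than shown:

```python
import sqlite3
from string import Template

# Hypothetical report layout. The real generator filled a Word template
# with these values, then exported a secured, read-only PDF.
REPORT = Template(
    "Exchange Report\n"
    "===============\n"
    "Server:    $server\n"
    "Mailboxes: $mailboxes\n"
)

def generate_report(conn, server):
    """Pull one server's row from SQL and render the report body."""
    row = conn.execute(
        "SELECT mailboxes FROM servers WHERE name = ?", (server,)).fetchone()
    # In the full pipeline, this text would be placed into the Word
    # document and the document converted to PDF; both steps omitted here.
    return REPORT.substitute(server=server, mailboxes=row[0])
```

Once the data-to-document step is code instead of copy-and-paste, the "one night" claim is less surprising: the rest is wiring the export together and scheduling it.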