Tag: programming

Telcos are Attacking the Internet

I generally try to stay out of politics on this blog, but this time something has to be said, as it affects anyone who uses the internet, at least in the US.

Basically, a number of telcos and cable providers are talking about charging internet content providers -- the places you browse to on the internet, places like Google, Yahoo!, Amazon, etc. -- fees to ensure bandwidth to their sites. Their argument is that these content providers are getting a 'free ride' on their lines, and generating a lot of traffic themselves, and should thus be paying for the cost of bandwidth.

This is patently ridiculous. Content providers already have to pay for their bandwidth -- they, too, have ISPs or agreements with telcos in place, either explicitly or via their hosting providers. Sure, some of them, particularly search engines, send out robots in order to index or find content, but, again, they're paying for the bandwidth those robots generate. Additionally, people using the internet are typically paying for bandwidth as well, through their relationship with their ISP. What this amounts to is the telcos getting paid not just by each person to whom they provide internet access, but by every end point on the internet, at least those within the US.

What this is really about is telcos wanting more money, and wanting to push their own content. As an example, let's say your ISP is AOL. AOL is part of Time Warner, and thus has ties to Time Warner's media properties. Those properties may put pressure on AOL to reduce bandwidth to sites operated by competitors -- ABC, CBS, NBC, FOX, Disney, PBS, etc. This might mean that your kid can no longer visit the Sesame Street website reliably, because AOL has reduced the amount of bandwidth allowed to that site -- while any media site in the Time Warner family would get optimal access, so your kid could still get to Cartoon Network. Not to slam Cartoon Network (I love it), but would you rather have your kid visiting cartoonnetwork.com or pbskids.org? Basically, content providers would no longer compete based on the value of their content, but on the deals they could strike with the carriers.

Here's another idea: your ISP is MSN. You want to use Google... but MSN has limited the bandwidth to Google because it's a competitor, and won't accept any amount of money to increase that bandwidth. They do the same with Yahoo! So, now you're limited to MSN search, because that's the only one that responds reliably -- regardless of whether or not you like their search results. By doing so, they've just artificially inflated the value of their search engine -- without needing to compete based on merit.

Additionally, let's say Barnes and Noble has paid MSN to ensure good bandwidth, but part of that agreement is a non-compete clause. Now you find your connections to Amazon timing out, meaning that you can't even see which book provider has the better price on the book you want; you're stuck looking and buying from B&N.

Now, let's look at something a little closer to home for those of us developing web applications. There have been a number of success stories in the last few years: MySpace, Digg, and Flickr all come to mind. Would these endeavors have been as successful had they needed to pay multiple times for bandwidth -- once to their ISP, and once to each telco charging content providers? Indeed, some of these are still free services -- how would they ever have been able to pay the extra amounts to the telcos in the first place?

So, basically, the only winners here are the telcos.

Considering how ludicrous this scheme is, one has to wonder: isn't the US Government going to step in and regulate against such behaviour? The answer, sadly, is no. The GOP doesn't like regulation, and so they want market forces to decide. What this will likely do is force a number of content providers to move their internet operations offshore -- which is likely to have some pretty negative effects on the economy.

The decision isn't final -- efforts can still be made to prevent it (the above link references a Senate committee meeting; there's been no vote on it). Call your representatives today and give them an earful. Tell them it's not just about regulation of the industry, but about fair competition in the market. Allowing the telcos to extort money from content providers will only reduce the US' economic chances in the world, and stifle innovation and choice.

Continue reading...

XP + Cygwin + coLinux == Productivity

I wrote earlier of my experiences using Windows XP, a move I've considered somewhat unfortunate but necessary. I've since added a couple more tools to my toolbox that have made the environment even better.

Continue reading...

Using Windows XP

Those of you who know me well know that I've been using Linux as my primary OS for many years now. At this point, for me, using Windows is like deciding I'm going to use a limited vocabulary; I can get the idea across, but not quite as well.

Due to the nature of where I work and the fact that I'm telecommuting, I've been having to maintain a dual-boot system. I use Ubuntu for my daily OS, and boot into Windows when I need to interact with people at work via Webex or Skype (we're using the new Skype beta with video, and it only works on Windows XP at this time).

This week, however, I've had to stay on Windows quite a bit -- lots of impromptu conference calls and such. So, I've been customizing my environment, and been pretty pleased with the results.

Continue reading for some tips on customizing your Windows XP environment to work and feel a little more like... Linux.

Continue reading...


Life is in transition for me now. Two weeks ago, we got to bring our handsome baby boy home, and I haven't been sleeping much since (though more than Jen). On top of the sleep deprivation, however, comes more exciting news: I've been hired as a PHP Developer by Zend Technologies!

I was approached by Daniel Kushner in late July regarding another position at Zend, and was flown out at the beginning of August. While I felt the interview went well, I harbored some doubts; work got fairly busy shortly thereafter, and then, of course, Liam was born, and the interview went completely out of my head -- until about three days after Liam's birth, when Daniel contacted me again about the PHP Developer position.

Work started yesterday, and I was flown to Zend's offices in Cupertino, CA, for orientation and to sit down with both Daniel and others to prepare for the projects on which I will be working. Thankfully, the job will not require that I move, and I will be working out of the 'home office' in Vermont when I return later this week.

The decision to leave NGA was difficult, but the opportunity to work with Zend is just too good to miss. I am honored to be selected by them, and hope this is the beginning of many good things to come.

Continue reading...

New Cgiapp Site

I've been extremely busy at work, and will continue to be through the end of March. I realized this past week that I'd set a goal of having a SourceForge website up and running for Cgiapp by the end of January -- and it's now mid-February. Originally, I was going to backport some of my libraries from PHP5 to PHP4 so I could do so... and I think that was beginning to daunt me a little.

Fortunately, I ran across a quick-and-dirty content management solution yesterday called Gunther. It does templating in Smarty, and uses a wiki-esque syntax for markup -- though page editing is limited to admin users only (something I was looking for). I decided to try it out, and within an hour or so had a working site ready to upload.

Cgiapp's new site can be found at cgiapp.sourceforge.net.


Shortly after I wrote this original post, I figured out what the strength of Gunther was -- and why I no longer needed it. Gunther was basically taking content entered from a form and then inserting that content (after some processing for wiki-like syntax) into a Smarty template. Which meant that I could do the same thing with Cgiapp and Text_Wiki. Within an hour, I wrote an application module in Cgiapp that did just that, and am proud to say that the Cgiapp website is 100% Cgiapp.
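
For the curious, here's a rough sketch of what such a module might look like. This is not the actual code behind the Cgiapp site -- the class name, file paths, and template name are invented -- and it assumes CGI::Application-style setup()/run_modes() plus the Cgiapp 1.x template methods (tmpl_assign()/load_tmpl()) and PEAR's Text_Wiki:

  <?php
  // Hypothetical sketch only: a Cgiapp application module that reads a flat
  // content file, runs it through Text_Wiki, and drops the result into a
  // Smarty template. Class, paths, and template name are invented.
  require_once 'Cgiapp.class.php';
  require_once 'Text/Wiki.php';

  class WikiSite extends Cgiapp
  {
      function setup()
      {
          $this->run_modes(array('display' => 'display'));
          $this->start_mode('display');
          $this->tmpl_path('./templates/');
      }

      function display()
      {
          // Page name comes from the query string; content lives in flat files
          $page = isset($_GET['page']) ? basename($_GET['page']) : 'home';
          $text = @file_get_contents('./content/' . $page . '.txt');

          // Transform the wiki-esque markup into XHTML
          $wiki = new Text_Wiki();
          $html = $wiki->transform($text, 'Xhtml');

          // Hand the rendered content to the Smarty template
          $this->tmpl_assign('content', $html);
          return $this->load_tmpl('page.tpl');
      }
  }

  // A driver script would then simply instantiate and run the module:
  // $app = new WikiSite();
  // $app->run();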

Continue reading...

Dreamweaver, Text Editors, and Webmasters

I picked up on this article on Friday, glanced through it and moved on, but noticed this evening it had been slashdotted -- at which point I realized the author is the current CGI::Application maintainer, so I looked again.

At my first glance through it, it appeared the author was looking for a nice, easy-to-use pre-processor script for generating a site out of templates and content files. In the end, he recommended ttree, part of the Template Toolkit distribution.

However, the real gist of the article -- something that should probably have been summarized at the end -- is that the author was looking for a free, open source replacement for Dreamweaver's Templates functionality. This functionality allows a developer to create a template with placeholders for content, lock it, and then create pages that fill in the bits and pieces of content. Finally, the developer compiles the site -- creating final HTML pages out of the content merged with the templates.

Now, I can see something like this being useful. I've used webmake for a couple of projects, and, obviously, utilize PHP in many places as a templating language. However, several comments on Slashdot also gave me some pause. The tone of these comments was to the effect of, "real developers shouldn't use DW; they should understand HTML and code it directly." Part of me felt this was elitist -- the web is such an egalitarian medium that there should be few barriers to entry. However, the webmaster in me -- the professional who gets paid each pay period and makes a living off the web -- also agreed with this substantially.

I've worked -- both professionally and as a freelancer -- with individuals who use and rely on DW. The problem I see with the tool and others of its breed is precisely their empowerment of people. Let me explain.

I really do feel anybody should be able to have a presence on the 'net. However, HTML is a fragile language: trivial errors can cause tremendous changes in how a page is rendered -- and even crash browsers on occasion. The problem I see is that DW and other GUI webpage applications create, from my viewpoint, garbage HTML. I cannot tell you how many pages generated by these applications I've had to clean up and reformat. They spuriously add tags, often around empty content, that are simply unnecessary.

The problem is compounded when individuals have neither time nor inclination to learn HTML, but continue using the tool to create pages. They get a false sense of accomplishment -- that can be quickly followed by a very real sense of incompetence when the page inexplicably breaks due to an edit they've made -- especially when the content is part of a larger framework that includes other files. Of course, as a web professional, I get paid to fix such mistakes. But I feel that this does everybody a disservice -- the individual/company has basically paid twice for the presentation of content -- once to the person generating it, a second time to me to fix the errors.

This is a big part of the reason why I've been leaning more and more heavily on database-driven web applications. Content then goes into the database, and contains minimal -- if any -- markup. It is then injected into templates, which go through a formal review process, as well as through the W3C validators, to prevent display problems. This puts everybody in a position of strength: the editor generating content, the designer creating the look-and-feel, and the programmer developing the methods for mapping content with the templates.

There's still a part of me that struggles with what I perceive as an elitist position. However, there's another part of me that has struggled heavily with the multitasking demands made on all web professionals -- we're expected to be editors, graphic designers, programmers, and more. In most cases, we're lucky if we're strong in one or two such areas, much less passionate about staying abreast of the changing face of our medium.

Continue reading...

Sign of a Geek

It's now been confirmed: I'm a geek.

Okay, so that probably comes as no shocker to those of you who know me, but it's the little things that make me realize it myself.

I've been frequenting Perl Monks for a couple of years now, mainly to garner ideas and code to help me with my personal or work projects. I rarely post comments, and I've only once submitted a question to the site. However, I do visit the site regularly, and the few comments I've put in -- generally regarding usage of CGI::Application -- have typically been well-moderated.

Well, yesterday I made a comment to a user asking about editors to use with perl. I was incensed by a remark he made about VIM not having the features he needed. Now, as I said in my comment, I've used VIM on a daily basis for over two years, and I'm still discovering new features -- and I've used all of the features he was looking for.

This is where I discovered I'm a geek: my comment made it into the Daily Best for today, peaking around number 5. The fact that that made my day indicates to me that I must be a geek.

Oh -- and VIM rules!

Continue reading...

MySQL Miscellanea

Inspired by a Slashdot book review of High Performance MySQL.

I've often suspected that I'm not a SQL guru... little things like being self-taught and having virtually no resources for learning it. This has been confirmed to a large degree at work, where our DBA has taught me many tricks about databases: indexing, when to use DISTINCT, how and when to do JOINs, and the magic of TEMPORARY TABLEs. I now feel fairly competent, though far from being an expert -- I certainly don't know much about how to tune a server for MySQL, or how to tune MySQL itself for performance.

Last year around this time, we needed to replace our MySQL server, and I got handed the job of getting the data from the old one onto the new. At the time, I looked into replication, and from there learned about making binary copies of the data store. I started using this as a way to back up data, instead of periodic mysqldumps.

One thing I've often wondered since: would replication be a good way to do backups? It seems like it would, but I haven't investigated. One post on the aforementioned Slashdot article addressed this, with the following summary:

  1. Set up replication
  2. Do a locked table backup on the slave

Concise and to the point. I only wish I had a spare server on which to implement it!
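
I haven't tried it, but here's a rough sketch of how that two-step recipe might be scripted. The hostname, credentials, and paths below are placeholders, and it assumes you can afford to pause the slave's SQL thread for the duration of the dump:

  <?php
  // Hypothetical sketch of the "back up from the slave" recipe above.
  // Host, credentials, and backup path are placeholders.
  $slave = mysql_connect('slave.example.com', 'backup', 'secret');

  // Pause replication and lock the tables so the dump is consistent;
  // the master keeps serving requests the whole time.
  mysql_query('STOP SLAVE SQL_THREAD', $slave);
  mysql_query('FLUSH TABLES WITH READ LOCK', $slave);

  // Dump everything while the slave is quiescent
  system('mysqldump --all-databases --host=slave.example.com '
       . '--user=backup --password=secret > /backups/nightly.sql');

  // Release the lock and let replication catch back up
  mysql_query('UNLOCK TABLES', $slave);
  mysql_query('START SLAVE SQL_THREAD', $slave);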

Continue reading...

Cgiapp Roadmap

I've had a few people contact me indicating interest in Cgiapp, and I've noticed a number of subscribers to the freshmeat project I've set up. In addition, we're using the library extensively at the National Gardening Association in developing our new site (the current site is using a mixture of ASP and Tango, with several newer applications using PHP). I've also been monitoring the CGI::Application mailing list. As a result of all this activity, I've decided I need to develop a roadmap for Cgiapp.

Currently, planned changes include:

  • Version 1.x series:
    • Adding a Smarty registration for stripslashes (the Smarty "function" call will be sslashes).
    • param() bugfix: currently, calling param() with no arguments simply gives you a list of parameters registered with the method, but not their values; this will be fixed.
    • error_mode() method. The CGI::Application ML brought up and implemented the idea of an error_mode() method to register an error_mode with the object (similar to run_modes()). While non-essential, it would offer a standard, built-in hook for error handling.
    • $PATH_INFO traversing. Again, on the CGI::App ML, a request was brought up for built-in support for using $PATH_INFO to determine the run mode. Basically, you would pass a parameter indicating which location in the $PATH_INFO string holds the run mode.
    • DocBook tutorials. I feel that too much information is given in the class-level documentation, and that usage tutorials need to be written. Since I'm documenting with PhpDoc and targeting PEAR, moving tutorials into DocBook is a logical step.
  • Version 2.x series:
    Yes, a Cgiapp2 is in the future. There are a few planned changes that require either (a) PHP5 or (b) API changes. In keeping with PEAR guidelines, I'll rename the module Cgiapp2 so as not to break applications designed for Cgiapp.

    Changes expected include:
    • Inherit from PEAR. This will allow for some built in error handling, among other things. I suspect that this will tie in with the error_mode(), and may also deprecate croak() and carp().
    • Changes to tmpl_path() and load_tmpl(). In the perl version, you would instantiate a template using load_tmpl(), assign your variables to it, and then do your fetch() on it. So, this:
      $this->tmpl_assign('var1', 'val1');
      $body = $this->load_tmpl('template.html');
      Becomes this:
      $tmpl = $this->load_tmpl();
      $tmpl->assign('var1', 'val1');
      $body = $tmpl->fetch('template.html');
      Or this:
      $tmpl = $this->load_tmpl('template.html');
      $tmpl->assign('var1', 'val1');
      $body = $tmpl->fetch();
      (Both examples assume use of Smarty.) I want to revert to this behaviour for several reasons:
      • Portability with perl. This is one area in which the PHP and perl versions differ greatly; going to the perl way makes porting classes between the two languages simpler.
      • Decoupling. The current set of template methods creates an object as a parameter of the application object -- which is fine, unless the template object instantiator returns an object of a different kind.
      • Smarty can use the same object to fill multiple templates, and the current methods make use of this. By assigning the template object locally to each method, this could be lost.

        HOWEVER... an easy work-around would be for load_tmpl() to create the object and store it in a parameter; subsequent calls would return the same object reference. The difficulty then would be if load_tmpl() assumed a template name would be passed. However, even in CGI::App, you decide on a template engine and design for that engine; there is never an assumption that template engines should be swappable.
      • Existing Cgiapp1 applications would need to be rewritten.
    • Plugin Architecture: The CGI::App ML has produced a ::Plugin namespace that utilizes a common plugin architecture. The way it is done in perl is through some magic of namespaces and export routines... both of which are, notably, missing from PHP.

      However, I think I may know a workaround for this, if I use PHP5: the magic __call() overloader method.

      My idea is to have plugin classes register the methods that should be accessible by a Cgiapp-based class under a special key in the $GLOBALS array. Then, the __call() method would check that key for registered methods; if one is found matching the requested method, it is called (using call_user_func()), with the Cgiapp-based object reference as the first argument. Voila! Instant plugins! (See the sketch following this roadmap for a rough illustration.)

      Why do this? A library of 'standard' plugins could then be created, such as:
      • A form validation plugin
      • Alternate template engines as plugins (instead of overriding the tmpl_* methods)
      • An authorization plugin
      Since the 'exported' methods would have access to the Cgiapp object, they could even register objects or parameters with it.
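
To make that idea a bit more concrete, here is a rough, untested sketch of how it might look under PHP5. The class names, the registry key, and the example plugin are all invented for illustration -- this is not an actual Cgiapp2 API:

  <?php
  // Hypothetical sketch of the __call()-based plugin idea; class names and
  // the $GLOBALS registry key are illustrative only.
  class Cgiapp_Plugin_Upper
  {
      // A plugin registers its callable under a well-known key in $GLOBALS
      public static function register()
      {
          $GLOBALS['__CGIAPP_PLUGINS']['upper'] = array(__CLASS__, 'upper');
      }

      public static function upper($app, $value)
      {
          // $app is the Cgiapp-based object, passed as the first argument
          return strtoupper($value);
      }
  }

  class Cgiapp2_Sketch
  {
      // __call() intercepts calls to undefined methods and dispatches them
      // to any plugin callable registered under that name
      public function __call($method, $args)
      {
          if (isset($GLOBALS['__CGIAPP_PLUGINS'][$method])) {
              array_unshift($args, $this);   // object reference goes first
              return call_user_func_array($GLOBALS['__CGIAPP_PLUGINS'][$method], $args);
          }
          trigger_error("Method '$method' not found", E_USER_ERROR);
      }
  }

  // Usage: register the plugin once, then call it as though it were a method
  Cgiapp_Plugin_Upper::register();
  $app = new Cgiapp2_Sketch();
  echo $app->upper('instant plugins');   // prints "INSTANT PLUGINS"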

If you have any requests or comments on the roadmap, please feel free to contact me.

Continue reading...

New site is up!

The new weierophinney.net/matthew/ site is now up and running!

The site has been many months in planning, and about a month or so in actual coding. I have written the site in PHP, instead of flatfiles, so as to:

  • Allow easier updating (it includes its own content management system)
  • Include a blog for my web development and IT interests
  • Allow site searching (everything is an article or download)

I've written it using a strict MVC model, which means that I have libraries for accessing and manipulating the database; all displays are template driven (meaning I can create them with plain-old HTML); and I can create customizable applications out of various controller libraries. I've called this concoction Dragonfly.
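
By way of illustration only -- this is not Dragonfly's actual code, and the class, table, and template names are made up -- the separation described above looks roughly like this: the model is the only layer that talks to the database, the controller pulls data from the model, and a Smarty template renders the HTML:

  <?php
  // Illustrative sketch of the MVC separation described above; not actual
  // Dragonfly code. Class, table, and template names are invented.
  class ArticleModel
  {
      // Model: the only layer that touches the database
      function fetchRecent($limit = 5)
      {
          $result = mysql_query('SELECT id, title, body FROM articles ORDER BY created DESC LIMIT ' . (int) $limit);
          $rows   = array();
          while ($row = mysql_fetch_assoc($result)) {
              $rows[] = $row;
          }
          return $rows;
      }
  }

  class BlogController
  {
      // Controller: pulls data from the model and hands it to a template
      function index()
      {
          $model = new ArticleModel();
          $view  = new Smarty();
          $view->assign('articles', $model->fetchRecent());

          // View: a plain-old HTML file with Smarty placeholders
          return $view->fetch('blog/index.tpl');
      }
  }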

There will be more developments coming -- sitewide search comes to mind, as well as RSS feeds for the blog and downloads.

Stay Tuned!

Continue reading...