converting PHP to HTML

Hello everyone.

At work I have been asked to maintain an online newsletter for my division. To make updating easier, I intend to build a simple PHP/MySQL-driven site to display articles in a blog-like style.

The problem is the destination server does not support any server-side language.

So my question is: can I build the PHP/MySQL-driven site on my local computer, then somehow automate the process to compile each dynamically generated page into HTML? The idea is that I maintain the site on my computer, then every day/week/month generate the latest version of the site to upload to the server.

Thanks in advance!
 
Hello Jacob.

Yes, I realise I can do that, but it's not very efficient with hundreds of pages. Plus, it means I have to manually change the links that allow the user to navigate to each page.
 
Isn't it just easier to get them to enable PHP? If you're on Apache, all you have to do is uncomment a few lines in the config file.
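
Off the top of my head, on a stock Apache 1.3 install the lines in httpd.conf look something like this (the module name and path vary with your Apache and PHP versions, so treat it as a rough guide):

# In httpd.conf: remove the leading # from these and restart Apache.
LoadModule php4_module libexec/httpd/libphp4.so
AddModule mod_php4.c

# And make sure .php files actually get handed to PHP:
AddType application/x-httpd-php .php

Then an apachectl restart and you're done.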
 
Assuming you can't enable PHP, and assuming that the remote server is a Unix/Linux variant, you should look into setting up an rsync scenario and automating that via cron.
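
Roughly, assuming the generated HTML ends up in /Library/WebServer/Documents/mynews locally and the remote docroot is /path/to/www (adjust the user, host and paths; the script location here is just an example):

#!/bin/sh
# sync_news.sh -- push the locally generated pages to the remote server.
# -a preserves times/permissions, -z compresses in transit,
# --delete removes remote files that no longer exist locally.
/usr/bin/rsync -az --delete \
    /Library/WebServer/Documents/mynews/ \
    me@remoteserver.com:/path/to/www/

And a crontab entry (crontab -e) to run it every night at, say, 2 AM:

0 2 * * * /path/to/sync_news.sh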

You could also set up two scripts, local and remote, to automate an archive/transfer/unarchive scenario.

For instance, on your machine -- in basically correct sh syntax:

cd /Library/WebServer/Documents/mynews
/usr/bin/tar czf `date +%Y-%m-%d`.tgz *
/usr/bin/scp `date +%Y-%m-%d`.tgz me@remoteserver.com:/path/to/www/`date +%Y-%m-%d`.tgz

This would create the archive, name it after today's date, e.g., 2007-03-13.tgz, and transfer it to htdocs on the remote machine. You could write a script to run on the remote server at some good witching hour -- say, 4 AM -- to find this file, untar it, and delete it.
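
The remote end could be as simple as this (the docroot path is assumed; adjust to taste):

#!/bin/sh
# unpack_news.sh -- run on the remote server via cron.
# Unpacks whatever dated archive has arrived in the docroot, then deletes it.
cd /path/to/www
for archive in *.tgz; do
    [ -f "$archive" ] || continue
    /usr/bin/tar xzf "$archive"
    rm -f "$archive"
done

With a crontab entry on the remote machine for the 4 AM run:

0 4 * * * /path/to/unpack_news.sh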
 
billbaloney: that would only copy the files as-is, without "compiling" the PHP.

My suggestion is to use wget to automatically download the whole site by recursively following all links. It's the -r option, IIRC.
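
Something like this, assuming the dynamic site is served locally at http://localhost/mynews/ (exact options may differ between wget versions):

# -r   recurse through every link on the site
# -np  don't wander up into parent directories
# -k   rewrite links so they work in the local copy
# -E   save pages with an .html extension instead of .php
# -P   directory to put the static copy in
wget -r -np -k -E -P ./static-site http://localhost/mynews/

The -E and -k combination also means you don't have to rename .php pages by hand, since the saved files get .html extensions and the links are rewritten to match.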

What would definitely be easier, though, is getting a cheap web hosting plan with PHP and MySQL ;)
 
billbaloney: that would only copy the files as-is, without "compiling" the PHP.

Not if the directory /Library/WebServer/Documents/mynews contains the output of the PHP files, which was the assumption in that scenario.

What I'm detailing is basically the same process that standard blog software, e.g., Blogger, Movable Type, uses to (a) output flat HTML files and (b) move those files, if requested, across a network.

Actually, TTC, you should consider using a free Movable Type installation to generate your blog as flat HTML on your local machine, and then develop a process to transfer the updated site to the remote server.
 
Thanks so much for all your advice. Your suggestions set me on a path to find my solution: SiteSucker.

With this app I can suck down my localhost version of the site and, provided I tell it to localise the content, it will save a cache of my site as separate files! At the end I still need to change the extensions from .php to .htm, but otherwise the relative links still function.
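
I'll probably script that renaming step rather than do it by hand; a rough, untested first pass, assuming the SiteSucker output lands in a folder called mynews and the links are plain relative hrefs:

#!/bin/sh
# Rename every saved .php page to .htm, then patch the links to match.
cd mynews
find . -name '*.php' | while read f; do
    mv "$f" "${f%.php}.htm"
done
# Mac OS X's sed needs the '' after -i; GNU sed takes plain -i.
find . -name '*.htm' -exec sed -i '' 's/\.php/\.htm/g' {} \;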

Thanks again. I would not have come across the solution had it not been for your suggestions :)
 