Another ASP question...
I've been working on my website a fair bit. Working on it locally works well for some things, but the Google Maps stuff only works if it's running on my real domain, so I have to get it up to the live site.
I used "publish" to do that, which turns out to be a bit of a mistake. When you do this, VS says that it will delete files, and asks for confirmation. I assumed this was just about overwriting the files that were up there, but it's really a "scorched earth" approach which toasted every file on my website before deploying the application.
A day later and a few $$$ lighter, I had my content back from backup.
So, what should I be doing? Publish is convenient in that it gets everything up there, but it takes roughly forever to do so, so it's not the solution I'd prefer. I've looked at "Copy Website", which might work, but I presume that I would have to a) figure out what I've changed and b) copy each file up to the server in the right place. Doing this correctly (i.e., getting all the files and any assemblies I use up there) is pretty tough, and I don't want a site that's sometimes published and sometimes web-copied...
Is there a better way of doing what I want to do? Or is the presumption that I will do all my development locally?
Comments
- Anonymous
August 26, 2005
It may be tough in a Windows environment (you may need to use Cygwin), but rsync is a very good tool.
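For what it's worth, a minimal sketch of what that might look like (user, host, and paths are placeholders, and Cygwin's rsync over SSH is assumed). Note that --delete mirrors local deletions to the server, so a dry run first is prudent:

    # preview what would be copied or deleted (placeholder paths and host)
    rsync -avzn --delete ./mysite/ user@example.com:/var/www/mysite/
    # if that looks right, run it for real
    rsync -avz --delete ./mysite/ user@example.com:/var/www/mysite/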
- Anonymous
August 26, 2005
If you have your stuff in version control, it's easy, because deploying is as simple as getting the latest stuff from your repository; just check in the assemblies and everything. This way, you don't need to keep track of what's changed.
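As a rough sketch of that workflow with Subversion (the repository URL and file names here are invented for illustration): check the build output in alongside the source, then pull it on the server:

    # on the dev box: version the compiled assemblies along with the code
    svn add bin/MySite.dll
    svn commit -m "deploy: latest build"
    # on the server: check out once, then just update on each deploy
    svn checkout http://svn.example.com/mysite/trunk .
    svn update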
- Anonymous
August 26, 2005
I usually zip up everything and xcopy the files over the ones that currently exist on your real website. Only the files that are included in your ASP.NET web project will be copied. It looks like the copy command will let you specify that you only want to copy those files that are in the project, and it will leave the non-project files on your website alone.
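A hedged sketch of the xcopy approach (the paths are made up, and it assumes the web root is reachable as a file share): /E recurses into subdirectories, /D copies only files newer than what is already at the destination, and /Y suppresses the overwrite prompts.

    rem mirror changed files onto the server (placeholder paths)
    xcopy C:\dev\MySite \\webserver\wwwroot\MySite /E /D /Y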
- Anonymous
August 26, 2005
I noticed that while Google Maps works only on a real domain, Virtual Earth (http://virtualearth.msn.com/) works locally and does not require signing up for an API key. The developer resources are at http://www.viavirtualearth.com/vve/Dashboard/Default.ashx
- Anonymous
August 27, 2005
True, the publish options in VS sucketh mightily. The easiest option for what you want (without installing additional software) is indeed Copy Website. Highlight all files and folders in your local hierarchy, and push the two-arrowed "Synchronize" button. It will (eventually) publish all the changed files but not the unchanged ones.
- Anonymous
August 27, 2005
While developing locally, try editing your "hosts" file so that your real domain points to 127.0.0.1 (example below).
Maybe you can trick it into working while you dev.
If not, Unleashit works well despite its unfortunate name.
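To illustrate the hosts trick (the domain below is a stand-in for the real one): on Windows the file lives at %SystemRoot%\system32\drivers\etc\hosts, and the entry is a single line. Remember to remove or comment it out when you're done testing:

    # send requests for the production domain to the local machine
    127.0.0.1    www.example.com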
- Anonymous
August 28, 2005
I've been using the VS 2005 Copy Web feature that shows which files have changed locally vs. on the server. It works great when you only want to replace a couple of files and not the whole site, which 2003 required.
I'm using the same hosting provider as you, and Copy Web connects via FTP.
- Anonymous
August 29, 2005
Use Subversion and "svn update" when you want to sync to a specific version of your code.
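And if "a specific version" means something other than the latest, svn update takes a revision number (the revision here is hypothetical):

    # roll the working copy to exactly revision 1234
    svn update -r 1234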
- Anonymous
September 05, 2005
Adding a 127.0.0.1 hosts entry does work with the Google Maps API. Just remembering to change it back is the tricky part.
- Anonymous
September 08, 2005
I just want to reiterate that your /etc/hosts file is the way to do GMaps development.