This wasn't really a conscious decision - when I set the site up, I started out just deleting the records when dogs were rehomed. But people who had been looking at the site wanted to know what had happened - it turned out that the people looking at it weren't just people who wanted to adopt dogs, but people who liked reading the stories of dogs, and they kept asking us for updates.
So, in a moment of weakness, I agreed that rather than deleting old records, we would move them to a different category, such as 'rehomed' or 'in loving memory', so people could see what happened next. This meant that almost all the dog details that have ever been added to the site are still on there somewhere. The Oldies Club has now been running for almost six years and lists dogs for several hundred rescues. So, there can be several new posts every day, each with several photos, and the whole thing has swollen to well over a gigabyte of data.
I don't write up and post all the day-by-day listings any more. That job is done by the Oldies Club Web team, a group of volunteers who take the basic submissions we get through from the rescues, write them up in proper English with as much detail as we can find, chase up the rescues for the information they forgot to include, and resize and prepare photos of the dogs. In short, they work to present the dogs as clearly, accurately and attractively as possible.
It's far too late now to start deleting stuff, that horse has bolted - and anyway, it is a promotional advantage to have piles of people visiting regularly, as they all show it to their relatives and friends, and sometimes get tempted to adopt when they hadn't intended to. And it's a search engine advantage to have so many thousands of pages on the site, too.
So, I had to move it all. Downloading and uploading it all again would be a horrible job, and awfully wasteful of bandwidth too. Instead, I installed the XCloner plugin, which takes all the files and the database belonging to a WordPress website, bundles them all into a giant tar file, and can then FTP it direct to the new location and install it again. It took aaaages to tar it all, and more aaaages to move it all to the new location, but it nobly struggled on, and after I left it overnight to get on with it, it finally completed that job.
The next task was to unpack the files and put them into place at the new location. XCloner handled this admirably (including updating the config file, which I wasn't expecting) until it came time to unpack the database, when it sadly complained that it had run out of memory, and stopped. I was not defeated. I adjusted the amount of memory available to PHP, and tried again. And again. Every time it tried to use more memory, and stalled.
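For anyone hitting the same wall: the memory available to PHP is usually set in php.ini, or for WordPress specifically in wp-config.php. The directive names below are real; the 512M value is only an example, and many shared hosts cap what you can actually get, whatever you ask for.

```php
// In wp-config.php - raises the memory ceiling WordPress asks PHP for.
// 512M is an example value, not a recommendation.
define( 'WP_MEMORY_LIMIT', '512M' );

// Alternatively, in php.ini (or a per-directory .user.ini, if your host allows it):
//   memory_limit = 512M
```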
There is an 'incremental SQL' option in XCloner for moving large databases, but this didn't want to work for me. So, I tried using XCloner to make a copy of my database without the ginormous posts table. That worked OK, so now I had a working website, but no posts or pages. Almost there. Then I tried a backup with just the posts table. Nope, still running out of memory.
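The same split can be done by hand with mysqldump, if you prefer to bypass the plugin entirely. The database, user, and table names below are stand-ins for illustration, and the dump commands are shown as comments because they need a live MySQL server to run against:

```shell
#!/bin/sh
# Sketch of splitting a dump around one oversized table, using mysqldump.
# 'wp_db' and 'wp_posts' are stand-in names; substitute your own.

DB=wp_db            # stand-in database name
BIGTABLE=wp_posts   # the huge table to dump separately

# Everything except the huge table (run against your own server):
#   mysqldump -u dbuser -p --ignore-table=$DB.$BIGTABLE $DB > everything-else.sql

# Just the huge table, on its own:
#   mysqldump -u dbuser -p $DB $BIGTABLE > posts-only.sql

echo "split plan ready for $DB.$BIGTABLE"
```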
It occurred to me that I could manually untar the backup that just had the problem table in it, then go in via command-line access and, at the mysql prompt, run source database-sql.sql to get the table installed that way.
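Concretely, the manual route looks something like this. The file, database, and user names are invented for illustration, and the small setup step at the top just fakes a backup archive so the sketch is self-contained; the mysql step itself is shown as a comment because it needs a live server:

```shell
#!/bin/sh
set -e

# Demo setup only: stand in for the XCloner archive that holds the posts table.
printf 'CREATE TABLE IF NOT EXISTS wp_posts (ID INT);\n' > database-sql.sql
tar -cf posts-backup.tar database-sql.sql
rm database-sql.sql

# Step 1: untar the backup to recover the SQL dump.
tar -xf posts-backup.tar

# Step 2: load the dump into the database (needs a live MySQL server,
# so shown here as a comment):
#   mysql -u dbuser -p wordpress_db
#   mysql> source database-sql.sql

ls database-sql.sql
```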
I must admit, I find command-line access a bit of a fiddle - I don't use it very often, so I can never remember half the commands and have to keep looking things up - but after a couple of tries I got there.
Kudos to XCloner: I got a very fast response to my request for help on their forum, and they have said they will have a look at my backup for me and see if they can see why it keeps demanding so much memory. Can't ask for better support than that, particularly for free!