Are you responsible for making OJS work -- installing, upgrading, migrating or troubleshooting? Do you think you've found a bug? Post in this forum.
Moderators: jmacgreg, btbell, michael, bdgregg, barbarah, asmecher
What to do if you have a technical problem with OJS:
1. Search the forum. You can do this from the Advanced Search page or from our Google Custom Search, which will search the entire PKP site. If you are encountering an error, we especially recommend searching the forum for that error message.
2. Check the FAQ to see if your question or error has already been resolved.
3. Post a question, but please, only after trying the above two steps. If it's a workflow or usability question, you should probably post to the OJS Editorial Support and Discussion subforum; if you have a development question, try the OJS Development subforum.
We're migrating OJS from one server to another. Everything is running smoothly, but we have a problem with the database, specifically with one table.
It is quite a big table (10 MB) and I cannot export it from one server to the other (we use MySQL). It looks like it stores article keywords or something similar, and although it has only been partially copied, everything seems to be working fine.
Question is: is there any way to export a MySQL table in separate pieces? Or is there a way to re-generate this table's contents?
Any help will be appreciated. Greetings from Spain,
- Posts: 31
- Joined: Wed Aug 03, 2005 12:04 am
- Location: Spain
If I'm not mistaken, the article_search_object_keywords table is used to speed up the search engine, and it is rebuilt as the system is used, so I don't think leaving it behind will affect normal use.
However, you can dump parts of the database instead of doing a full dump.
Using phpMyAdmin (or phpPgAdmin for PostgreSQL databases) to view your databases via the web makes it easy to export parts of the database as either zipped or plain SQL files.
Then, you can import them into the other server.
From the shell it may take a little more work, but you should be able to accomplish the same thing.
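As a shell-based sketch of the partial-dump approach: mysqldump can export a single table, and the dump file can then be split into smaller chunks for transfer. The database, table, and user names below are placeholders for your own.

```shell
# Dump just the one problematic table (placeholder names throughout).
mysqldump -u myuser -p mydatabase article_search_object_keywords > keywords.sql

# Split the dump into 2 MB pieces: keywords_part_aa, keywords_part_ab, ...
split -b 2m keywords.sql keywords_part_

# On the destination server, reassemble the pieces and import:
cat keywords_part_* > keywords.sql
mysql -u myuser -p mydatabase < keywords.sql
```

Because `split` and `cat` operate on raw bytes, the reassembled file is byte-identical to the original dump.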
- Posts: 940
- Joined: Wed Oct 15, 2003 6:15 am
- Location: Brasília/DF - Brasil
I've moved extremely large databases from server to server using mysqldump and netcat as follows:
- On the sending server, run:
- Code: Select all
mysqldump -u (username) -p (database name) | gzip -c -9 - | nc -l -p 5000
Enter the password and press <Enter>.
- On the receiving server, run:
- Code: Select all
nc (IP of sending server) 5000 | gunzip -c - | mysql -u (username) -p (database name)
Enter the password and press <Enter>. The transfer should start.
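If you want to sanity-check the gzip leg of that pipeline without touching MySQL or the network, a quick local roundtrip with the same flags confirms the compression is lossless:

```shell
# Compress and immediately decompress with the flags used in the
# transfer pipeline above; the output should match the input exactly.
echo "hello from the dump" | gzip -c -9 - | gunzip -c -
# prints: hello from the dump
```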
However, the article_search_... tables don't need to be transferred; you can re-generate them with the tools/rebuildSearchIndex.php script.
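For reference, rebuilding the index is a one-off command run from the OJS installation root on the new server (exact behavior and any flags may vary by OJS version, so check the script's usage output first):

```shell
# Run from the OJS installation directory; regenerates the
# article_search_* tables from the stored article data.
php tools/rebuildSearchIndex.php
```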
Open Journal Systems Team
Don't miss the First International PKP Scholarly Publishing Conference
July 11 - 13, 2007, Vancouver, BC, Canada
- Posts: 9454
- Joined: Wed Aug 10, 2005 12:56 pm