Thread: Migration from MySQL to PostgreSQL : Datatypes?
Hello,
What would be the best way to deal with migrating a database structure and data from a MySQL db to Postgres when the MySQL tables use datatypes that do not exist in Postgres, and thus throw errors in the current migration wizard?
If I leave these few tables out on the first import, many of the other tables that key back to them don't get created either.
I found a program called EasyFrom, but it's over $200 and only runs on Windows. Navicat, which I have, doesn't seem to have any provision to change the datatypes when syncing structure from one type of db to the other.
Any suggestions would be great.
thanks.
On Thu, Dec 23, 2010 at 9:12 AM, Bill P. <maxarbos@yahoo.com> wrote:
> I found a program called EasyFrom, but it's over $200 and only runs on
> Windows. Navicat, which I have, doesn't seem to have any provision to
> change the datatypes when syncing structure from one type of db to the
> other.
>
> Any suggestions would be great.
> thanks.

It sounds like you need a tool to extract, transform, and then load (ETL) from MySQL to PostgreSQL. There are many applications that will do this for you. Pentaho Kettle is a Java application that should work for you:

http://sourceforge.net/projects/pentaho/

--
Regards,
Richard Broersma Jr.
On Thursday 23 December 2010 19:12:18 Bill P. wrote:
> Hello,
>
> What would be the best way to deal with migrating a database structure
> and data from a MySQL db to Postgres when the MySQL tables use datatypes
> that do not exist in Postgres, and thus throw errors in the current
> migration wizard?
>
> If I leave these few tables out on the first import, many of the other
> tables that key back to them don't get created either.
>
> I found a program called EasyFrom, but it's over $200 and only runs on
> Windows. Navicat, which I have, doesn't seem to have any provision to
> change the datatypes when syncing structure from one type of db to the
> other.
>
> Any suggestions would be great.
> thanks.

Hello,

I don't know about the best way, but I have recently used this Perl script quite successfully. It's 'old', but nevertheless, it worked.

http://pgfoundry.org/projects/mysql2pgsql/

BR and Merry Christmas!

--
Aarni Ruuhimäki
--------------
This is a bug-free broadcast from Ubuntu 9.10 Karmic Koala Linux System
PROUD TO BE 100% Microsoft FREE!
Bill P. wrote:
> What would be the best way to deal with migrating a database structure
> and data from a MySQL db to Postgres when the MySQL tables use datatypes
> that do not exist in Postgres, and thus throw errors in the current
> migration wizard?
>
> If I leave these few tables out on the first import, many of the other
> tables that key back to them don't get created either.

I don't know how complex your schema is, but I would create the objects manually and then use Perl scripts to copy the data. Copying shouldn't be a problem, because MySQL supports things like this:

SELECT *
INTO OUTFILE '/tmp/emp.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM emp

That should be picked up quite nicely by the COPY command from PostgreSQL.

--
Mladen Gogala
Sr. Oracle DBA
1500 Broadway
New York, NY 10036
(212) 329-5251
http://www.vmsinfo.com
The Leader in Integrated Media Intelligence Solutions
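For reference, the matching load on the Postgres side can be a one-liner. The following is a minimal sketch: it assumes an emp table with matching columns already exists, and it uses the COPY option syntax introduced in PostgreSQL 9.0; MySQL's OUTFILE writes NULLs as \N by default, hence the NULL option:

-- Load the CSV produced by the MySQL export above into an existing
-- emp table. DELIMITER and QUOTE mirror the FIELDS TERMINATED BY and
-- ENCLOSED BY clauses; NULL '\N' matches MySQL's default NULL marker.
COPY emp FROM '/tmp/emp.csv'
    WITH (FORMAT csv, DELIMITER ',', QUOTE '"', NULL '\N');

Note that COPY ... FROM 'file' reads the file on the server host as the server process; psql's \copy variant reads a client-side file instead.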
Thanks for all the responses so far.
It seems I can get the data in with no problem, but I refuse to believe that I need to create all the tables manually. Most of the tables will import and convert just fine, so I don't think there is a need to do that.
As for the ETL suggestion: I have been using Talend to migrate and transform data imports, but actually migrating table structures from MySQL to Postgres is not something I am aware it can do.
I guess my main question is: is there a best practice for migrating MySQL db/schema tables to Postgres schema tables without having to do them manually, one at a time?
A migration tool like the one provided by EnterpriseDB would be great, IF it allowed me to change or fix errors as they occur instead of the entire process exiting.
Thanks.
From: Mladen Gogala <mladen.gogala@vmsinfo.com>
To: Bill P. <maxarbos@yahoo.com>
Cc: "pgsql-novice@postgresql.org" <pgsql-novice@postgresql.org>
Sent: Thu, December 23, 2010 12:50:10 PM
Subject: Re: [NOVICE] Migration from MySQL to PostgreSQL : Datatypes?
Bill P. wrote:
> Hello,
>
> What would be the best way to deal with migrating a database structure and data from a MySQL db to Postgres when the MySQL tables use datatypes that do not exist in Postgres, and thus throw errors in the current migration wizard?
>
> If I leave these few tables out on the first import, many of the other tables that key back to them don't get created either.
>
> I found a program called EasyFrom, but it's over $200 and only runs on Windows. Navicat, which I have, doesn't seem to have any provision to change the datatypes when syncing structure from one type of db to the other.
>
> Any suggestions would be great.
> thanks.
>
>
I don't know how complex your schema is, but I would create the objects manually and then use Perl scripts to copy the data. Copying shouldn't be a problem because MySQL supports things like this:
SELECT *
INTO OUTFILE '/tmp/emp.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM emp
That should be picked up quite nicely by the COPY command from PostgreSQL.
--
Mladen Gogala
Sr. Oracle DBA
1500 Broadway
New York, NY 10036
(212) 329-5251
http://www.vmsinfo.com
The Leader in Integrated Media Intelligence Solutions
On Thu, Dec 23, 2010 at 3:48 PM, Bill P. <maxarbos@yahoo.com> wrote:
> Thanks for all the responses so far.
>
> It seems I can get the data in with no problem, but I refuse to believe
> that I need to create all the tables manually. Most of the tables will
> import and convert just fine, so I don't think there is a need to do that.
>
> As for the ETL suggestion: I have been using Talend to migrate and
> transform data imports, but actually migrating table structures from
> MySQL to Postgres is not something I am aware it can do.

Some object-relational-mapper libraries (Hibernate, SQLAlchemy, etc.) could probably be used to automate much of the process.

Sean
Bill P. wrote:
> Thanks for all the responses so far.
>
> It seems I can get the data in with no problem, but I refuse to believe
> that I need to create all the tables manually.

Actually, you don't, but the trick is not for the faint-hearted. I did migrate one MySQL schema to Postgres, but used Oracle RDBMS as an intermediate step. Oracle Corp. has a great tool for migrating MySQL to Oracle. Once there, you can use the phenomenal ora2pg script to migrate the schema from Oracle --> PostgreSQL. Oracle RDBMS is, of course, not free.

--
Mladen Gogala
Sr. Oracle DBA
1500 Broadway
New York, NY 10036
(212) 329-5251
www.vmsinfo.com
Hello,

> the schema from Oracle --> PostgreSQL. Oracle RDBMS is, of course, not
> free.

If the data volume is not high (not more than 4 GB or thereabouts), you could use Oracle Express Edition:

http://www.oracle.com/technetwork/database/express-edition/overview/index.html

Regards,
Jayadevan
On 2010-12-23, Bill P. <maxarbos@yahoo.com> wrote:
> What would be the best way to deal with migrating a database structure
> and data from a MySQL db to Postgres when the MySQL tables use datatypes
> that do not exist in Postgres, and thus throw errors in the current
> migration wizard?
>
> I found a program called EasyFrom, but it's over $200 and only runs on
> Windows.

When I had to migrate an (mdbtools) Access dump, I was able to do all that was needed using sed.

(If you don't speak regex, that's probably not going to work for you, but it should be equally possible with any general-purpose programming language.)

Is there any sort of guarantee for that $200 tool? Paying for it might be better than reinventing your own. That translator will probably work fine under Wine. Ask the vendor; if they don't know, offer to test it for them: you might get a discount!

--
⚂⚃ 100% natural
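To make the sed approach concrete, a handful of substitutions on the dump file usually cover the common cases. The DDL below is invented for illustration, but it shows the kind of before/after a regex rewrite produces (backticks dropped, int AUTO_INCREMENT to serial, tinyint(1) to boolean, datetime to timestamp, ENGINE clause removed):

-- Hypothetical MySQL dump fragment, before rewriting:
CREATE TABLE `emp` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `active` tinyint(1) NOT NULL DEFAULT 1,
  `hired` datetime,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB;

-- The same table after the rewrite, valid for PostgreSQL:
CREATE TABLE emp (
  id serial PRIMARY KEY,
  active boolean NOT NULL DEFAULT true,
  hired timestamp
);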
Mladen Gogala wrote:
> Oracle RDBMS is, of course, not free.

No "of course" about it. You are mistaken. Oracle RDBMS is, too, free, in the Express Edition (XE) configuration. I do not know offhand if their MySQL migration tool is included in that configuration.

<http://www.oracle.com/us/products/database/express-edition/index.html>

No DBMS is free if you look beyond license fees.

--
Lew
Ceci n'est pas une pipe.
Jasen Betts wrote:
> When I had to migrate an (mdbtools) Access dump, I was able to do all
> that was needed using sed.
>
> Is there any sort of guarantee for that $200 tool? Paying for it might
> be better than reinventing your own.

There are a number of tools available to translate a schema between different engines, both FLOSS and commercial. The simple problem with all of them is that they make some basic assumptions about how to convert those datatypes that don't have a direct correspondence between the two engines. Those assumptions will *not* produce an efficient schema for the new engine in all cases. Furthermore, even if there is a comparable type, there is no guarantee that the actual use of the individual fields will work well with the result. Therefore, it is always necessary to examine the proposed translation to see whether it will actually work in the new environment.

For large volumes of data, trial conversions from a recent backup, with thorough functional and performance testing, are an absolute requirement before committing production systems. We do all of that testing just for a major version upgrade, even without a change of engines. We don't want our clients discovering problems after we update their servers.

Bob McConnell
N2SPP
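That caveat about non-corresponding types is easy to illustrate. A hand-tuned mapping, with table and column names invented for the example, will often choose differently from an automatic converter; an ENUM, for instance, is usually better served by a text column with a CHECK constraint than by a generic varchar:

-- Hypothetical hand-tuned PostgreSQL table; the comments show the
-- MySQL type each column replaces.
CREATE TABLE account (
    id         serial PRIMARY KEY,   -- int(11) AUTO_INCREMENT
    active     boolean NOT NULL,     -- tinyint(1)
    status     text NOT NULL
               CHECK (status IN ('new', 'open', 'closed')),  -- enum('new','open','closed')
    balance    numeric(12,2),        -- decimal(12,2)
    created_at timestamp             -- datetime
);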