Comments
LMagnus: You're not a real developer until you've mastered the Access DB export's newline issue.
That brings back painful memories!
Good job bro.
I've done quite similar jobs in my career. I once had to rescue a website that was archaic to maintain, with no way to export the database, because there was no database (an old pure-HTML site with tens of pages). So I dug into PHP and ended up using curl and DOM parsing, and saved every page into a MySQL table along with its URL.
Technically, my database was HTML!
Eventually I saved all the info, wiped the fuck out of that site, and built a brand new one on Yii 1.1 (back then it was something wow)... then injected the table I'd created into the new CMS I'd built. And voila: the new website was up, the old data was there, and I even created a map redirecting every old URL to its new counterpart to avoid losing SEO ranking.
Back then I was the fourth developer to try to fix the situation, and the only one thinking far enough outside the box to find a solution that was simple, yet in the blind spot of most developers.
good job
although I hope you backed up the export file, the scripts you used to convert it, and documentation explaining what you did
and (if possible and feasible) a script that verifies no data was lost, maybe a row-to-row equality check.
data conversion can go wrong, and no one wants to lose sensitive data.
if I were you and hadn't done either of those, just "eye checked" the data, I wouldn't exactly think I was a bad dev
but I wouldn't call myself a heckin professional either
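The row-to-row equality check suggested above could be sketched like this (a minimal Python sketch; the function and variable names are hypothetical, and it assumes both row sets fit in memory as lists of tuples in the same order):

```python
def rows_match(exported_rows, db_rows):
    """Return a list describing where the two row sets disagree."""
    mismatches = []
    # Compare counts first: a silently dropped row is the most common failure.
    if len(exported_rows) != len(db_rows):
        mismatches.append(("count", len(exported_rows), len(db_rows)))
    # Then compare row by row, recording the index of any difference.
    for i, (a, b) in enumerate(zip(exported_rows, db_rows)):
        if a != b:
            mismatches.append((i, a, b))
    return mismatches
```

An empty result means the migration round-tripped cleanly; anything else points at the exact rows to inspect.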
exerceo: Due to the risk of vendor lock-in, proprietary formats should be avoided where possible.
So a few days ago I felt pretty h*ckin professional.
I'm an intern and my job was to get the last 2003 server off the racks (It's a government job, so it's a wonder we only have one 2003 server left). The problem being that the service running on that server cannot just be placed on a new OS. It's some custom engineering document server that was built in 2003 on a 1995 tech stack and it had been abandoned for so long that it was apparently lost to time with no hope of recovery.
"Please redesign the system. Use a modern tech stack. Have at it, she's your project, do as you wish."
Music to my ears.
First challenge is getting the data off the old server. It's a 1995 .mdb file, so the most recent version of Access that would be able to open it is 2010.
Option two: There's an "export" button that literally just vomits all 16,644 records into a tab-delimited text file. Since this option didn't require scavenging up an old version of Access, I wrote a Python script to just read the export file.
And something like 30% of the records were invalid. Why? Well, one of the fields allowed for newline characters. This was an issue because records were separated by newline. So any record with a field containing newline became invalid.
Although, this did not stop me. Not even close. I figured it out and fixed it in about 10 minutes. All records read into the program without issue.
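(A minimal sketch of one way to handle this, not necessarily the fix used here: keep buffering lines until a full record's worth of tab-separated fields has accumulated. It assumes fields may contain newlines but never tabs, and that the per-record field count is known.)

```python
# Rebuild records from a tab-delimited export where a free-text field
# may contain raw newlines. Assumes fields never contain tabs and that
# every complete record has exactly `field_count` fields.
def read_records(lines, field_count):
    records = []
    buffer = ""
    for line in lines:
        # Re-join continuation lines with the newline the export lost.
        buffer = buffer + "\n" + line if buffer else line
        # A complete record contains field_count - 1 tab separators.
        if buffer.count("\t") >= field_count - 1:
            records.append(buffer.split("\t"))
            buffer = ""
    return records
```

With this, a record whose line was split in two comes back as one record with the newline restored inside the offending field.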
Next up: designing the database. My stack is MySQL and NodeJS, which my supervisors approved of. A lot of the data looked like it would fit into an integer column, but one or two odd records would have something like "1050b", which meant a few stray items kept me from having as slick a database design as I wanted. I designed the tables, about 18 columns per record, mostly varchar(64).
Next challenge: getting the exported data into the database. At first I thought of doing it record by record from my Python script: connect to the MySQL server and iterate over all the data I had. What I ended up doing instead was generating a .sql file and running that on the server. It took a few tries thanks to a lot of inconsistencies in the data, but eventually I got all 16k records into the new database, and I had never been so happy.
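(For illustration, generating such a .sql file might look something like this; the table and column names are made up, and for anything beyond a one-off bulk load, a driver's parameterized queries would be the safer route.)

```python
# Sketch: turn parsed records into INSERT statements for a bulk .sql file.
# Table/column names here are hypothetical; values are escaped by hand,
# which is tolerable for a one-off migration but never for live input.
def sql_escape(value):
    # Escape backslashes first, then single quotes, per MySQL string literals.
    return value.replace("\\", "\\\\").replace("'", "\\'")

def to_insert(table, columns, record):
    values = ", ".join("'%s'" % sql_escape(v) for v in record)
    return "INSERT INTO %s (%s) VALUES (%s);" % (table, ", ".join(columns), values)

def write_sql(path, table, columns, records):
    # One INSERT per record; the whole file can then be piped into mysql.
    with open(path, "w") as f:
        for record in records:
            f.write(to_insert(table, columns, record) + "\n")
```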
The next two hours were very productive: I designed a clean front end and had just enough time to put together a rough prototype that works entirely off AJAX requests. I want to keep it that way so that other services can consume this data, since an engineering data API may prove useful.
Anyways, that was my win story of the week. I was handed a challenge: an old, decaying server full of important data, and despite the hitches one might expect from archaic data, I was able to rescue every byte. I will probably be presenting my prototype to the higher-ups in Engineering sometime this week.
Happy Algo!
rant