Import SAS data into Hadoop

We are buying third-party survey data, and the vendor provides it in SAS format.

Source data format: SAS
Frequency: Daily
Data: Full one-year data set (no delta)

We would like to bring this data into our Hadoop environment on a daily basis. What are our options?

We asked them to send the data as a text file, but their text file had 8650 columns (for example, instead of a single Country column they had 250 columns, one per country). Our ETL tool failed to process that many columns. According to them, it is much easier to read the data in SAS format.

Any suggestions?



The problem here is not a technology problem; it sounds like they are just being unhelpful. I do most of my work in SAS and would never provide someone with a table with that many columns and expect them to import it.

Even if they sent it in SAS format, the SAS dataset is still going to have the same number of columns, and the ETL tool (even if it could read SAS datasets, which is unlikely) is still likely to fail.

Tell them to transpose the data in SAS so that there are fewer columns and then to re-send it as a text file.
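On the SAS side this would typically be done with PROC TRANSPOSE. Just to illustrate the reshaping being asked for, here is a minimal wide-to-long sketch in plain Python; the column names (respondent_id, q1_US, etc.) are made up for the example, not taken from the actual vendor file:

```python
import csv
import io

# Hypothetical sample of a "wide" survey extract where each country is its
# own column (the vendor's real file reportedly has 250 such columns and
# 8650 columns in total).
wide_csv = """respondent_id,q1_US,q1_UK,q1_DE
1001,5,3,4
1002,2,4,1
"""

def transpose_wide_to_long(rows, id_col, prefix):
    """Yield one narrow (id, country, value) record per country column."""
    for row in rows:
        for col, value in row.items():
            if col.startswith(prefix):
                yield {
                    "respondent_id": row[id_col],
                    "country": col[len(prefix):],
                    "value": value,
                }

reader = csv.DictReader(io.StringIO(wide_csv))
long_rows = list(transpose_wide_to_long(reader, "respondent_id", "q1_"))

# 2 respondents x 3 country columns -> 6 narrow rows
print(long_rows[0])  # {'respondent_id': '1001', 'country': 'US', 'value': '5'}
```

The long format keeps the column count fixed (here three columns) no matter how many countries are added, which is exactly what makes it friendlier for an ETL tool or a Hive table.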

Thanks, everyone.

I think this would solve my issue:

Need Your Help

Getting error: Conversion from string "" to type 'Date' is not valid

If anyone can rewrite this for me I'd really appreciate it. I'm trying to change the stored date if it is null. I've tried Me.vipEndDate = "0000-00-00" and much more.

ERROR: Cannot read property '_id' of undefined

node.js mongodb mongoose

I am trying to create a record using the Mongoose method Model.create as in the example below, and I am getting the error: Cannot read property '_id' of undefined