Map columns from excel to db in rails engine - ruby-on-rails

I am working on a Rails engine that takes an Excel file as input and saves it to a temp folder in the app. Then, when I click validate, it validates the file, and if there are no errors it saves the file's data to the database. I am able to read the columns of the Excel file, but my problem is that an incoming Excel file can have different column names than the ones I expect.
Also, I am not generating any migrations in the engine. The db table will be provided by the host app; my engine will only parse the data, validate it, and, if the Excel file is valid, save all the data to the table.
I am confused about how to set up a mapping so that the data is saved to the db correctly.
For example: my engine has a model upload.rb where I do all the validation for the Excel file and then save it. Now suppose my app has a model employee.rb with columns first name, last name, employee id, etc., while my Excel file can use any alias or different column name for the same data, e.g. fname, lname, etc.
The validation lives in my engine's model, so the engine has no idea about the mapping; it only validates and saves the data to the db. The mapping therefore needs to be set in the app.
Can anyone tell me how I can define such a mapping so that whenever my engine is mounted it parses the Excel file and saves the data to the app's db table as per the configured mapping?
Any help will be appreciated.
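One common pattern is to have the engine expose a configuration point that the host app fills in from an initializer, so the engine never hard-codes column names. A minimal sketch (all names here — ExcelImport, column_mapping, map_row — are hypothetical, and a spreadsheet row is shown as an already-parsed header => value hash):

```ruby
# Hypothetical sketch: the host app registers how spreadsheet headers
# map onto its own model attributes; the engine only consumes the mapping.
module ExcelImport
  class << self
    # e.g. { "fname" => :first_name, "lname" => :last_name }
    attr_accessor :column_mapping
  end

  # Translate one parsed spreadsheet row (header => cell value)
  # into an attributes hash the host app's model understands.
  # Unmapped columns are silently dropped.
  def self.map_row(row)
    row.each_with_object({}) do |(header, value), attrs|
      attribute = column_mapping[header.to_s.strip.downcase]
      attrs[attribute] = value if attribute
    end
  end
end

# In the host app (e.g. config/initializers/excel_import.rb):
ExcelImport.column_mapping = {
  "fname"       => :first_name,
  "lname"       => :last_name,
  "employee id" => :employee_id
}

row   = { "Fname" => "Ada", "Lname" => "Lovelace", "Employee ID" => 7 }
attrs = ExcelImport.map_row(row)
# attrs => { first_name: "Ada", last_name: "Lovelace", employee_id: 7 }
```

The host app could similarly register the target model class in the same initializer, so the engine calls the configured class rather than knowing about Employee directly.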

Related

Azure Data Factory read url values from csv and copy to sql database

I am quite new to ADF, so I am asking for suggestions.
The use case:
I have a CSV file that contains a unique id and URLs (see image below). I would like to use this file to export the values from the various URLs. The second image shows an example of the data behind one URL.
In the current situation I take each URL and insert it manually as the source of an ADF Copy activity to export the data to a SQL DB. This is a very time-consuming method.
How can I create an ADF pipeline that uses the CSV file as a source, so that a Copy activity takes each row's URL and copies the data to Azure SQL DB? Do I need to add a GetMetadata activity, for example? If so, how?
Many thanks.
Use a Lookup activity that reads all the data, then a ForEach loop that iterates row by row. Inside the ForEach, use a Copy activity to copy each response to the sink.
To copy the XML response of a URL, we can use an HTTP linked service with an XML dataset. As @BeingReal said, a Lookup activity should be used to read the table containing all the URLs, and inside a ForEach activity a Copy activity is added with HTTP as the source and whatever sink the requirement calls for. I tried to reproduce the same in my environment; below are the steps.
A lookup table with 3 URLs is used, as in the image below.
A ForEach activity is added in sequence after the Lookup activity.
Inside the ForEach, a Copy activity is added, with the HTTP linked service as its source.
In the HTTP linked service, the base URL is given as @item().name. Here name is the column that stores the URLs in the lookup table; replace name with the column name you used in your lookup table.
In the sink, an Azure SQL database is given (use any sink your requirement calls for). The data is copied to the SQL database.
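For reference, the glue between those steps is ADF's expression language (the activity name Lookup1 below is an assumption; use whatever you named your Lookup activity): the ForEach's Items setting is bound to the Lookup output, and the dataset URL parameter reads the current row inside the loop:

```
ForEach → Settings → Items:      @activity('Lookup1').output.value
URL parameter inside the loop:   @item().name
```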
This is the HTTP dataset inside the Copy activity.
This is the input of the Copy activity inside the ForEach.
This is the output of the Copy activity.
My sink is an Azure SQL Database without any tables yet. I would like to auto-create the table on the fly from ADF. I don't understand why this error came up.

Event correlation in Logstash (need to join fields from two files into a single JSON record)

I have log data at the path below:
/Download/aaaa/sysOne/one.txt
Error,12345,erredescriptionOne
Error,12345,erredescriptiontwo
Error,12345,erredescriptionthree
and I have one more file, /Download/aaa/config/config.txt, with the below data:
AppName,SystemName,DatabaseName
FirstName,sysOne,DBOne
SecondName,systwo,DBTwo
ThirdName,systhree,DBThree
FourthName,sysfour,DBFour
I want to load the data from both files into ES as a single JSON record, using the sysOne value as the reference.
I tried with add_field but it is not working.
Could you please help me with how to proceed on this?
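A hedged sketch of one way to do the join: Logstash's translate filter can enrich each log event with a value looked up by key. The field names and the path pattern below are assumptions about your setup, and translate's CSV dictionaries must have exactly two columns, so config.txt would first need to be reduced to key/value pairs (e.g. a hypothetical config_dict.csv containing lines like sysOne,DBOne):

```conf
filter {
  # Lines in one.txt look like: Error,12345,erredescriptionOne
  csv {
    columns => ["level", "code", "description"]
  }

  # Assumption: the system name (e.g. sysOne) can be extracted from the
  # file path that the file input records for each event.
  grok {
    match => { "[log][file][path]" => "/Download/aaaa/%{WORD:system_name}/" }
  }

  # Enrich the event from the config data; the dictionary file is the
  # two-column reduction of config.txt described above.
  translate {
    source          => "system_name"
    target          => "database_name"
    dictionary_path => "/Download/aaa/config/config_dict.csv"
  }
}
```

After this, each indexed event carries both the error fields and the looked-up database name in one JSON record.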

Is it possible to read contents of a file uploaded with CarrierWave?

I followed this example, File uploads with CarrierWave, to upload an Excel file (a class list) in my application. The file contains the students' names, surnames and gender.
When uploading, the user gives a name and subject for this "Class", which are then stored in the MyClasses table. I want to know whether I can read the contents of the file and save them (names, surnames, gender) in the Student table.
The uploaded Excel file looks like this...
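Yes — CarrierWave stores the file on disk, so once the record is saved you can open the file from the uploader's path (e.g. my_class.classlist.path if the uploader is mounted as classlist; that name is an assumption) and parse it. A real .xlsx would typically be parsed with a gem such as roo; the sketch below uses a CSV stand-in so it runs with only the standard library:

```ruby
require "csv"

# Hedged sketch: parse the stored class list into student attribute
# hashes. The header names ("Name", "Surname", "Gender") are assumptions
# about the spreadsheet layout.
def students_from_classlist(path)
  CSV.foreach(path, headers: true).map do |row|
    {
      name:    row["Name"],
      surname: row["Surname"],
      gender:  row["Gender"]
    }
  end
end

# Tiny demo file standing in for the uploaded class list:
File.write("classlist.csv", "Name,Surname,Gender\nThabo,Mokoena,M\nAnna,Smith,F\n")

students = students_from_classlist("classlist.csv")
# Each hash could then be persisted, e.g. students.each { |s| Student.create!(s) }
```

A natural place to call this is an after_save callback or a background job on the MyClasses model, once CarrierWave has finished storing the upload.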

Apache Solr: Merging documents from two sources before indexing

I need to index data from a custom application in Solr. The custom app stores metadata in an Oracle RDBMS and documents (PDF, MS Word, etc.) in a file store. The two are linked in the sense that the metadata in the database refers to a physical document (PDF) in the file store.
I am able to index the metadata from the RDBMS without issues. Now I would like to update the indexed documents with an additional field in which I can store the parsed content from the PDFs.
I have considered and tried the following
1. Using the Update RequestHandler to try to update the indexed document. This didn't work; the original document indexed from the RDBMS was overwritten.
2. Using SolrJ to do atomic updates, but I am not sure whether this is a good approach for something like this.
Has anyone come across this issue before and what would be the recommended approach?
You can update the document, but it requires that you know the id of the existing document. For example:
{
"id": "5",
"parsed_content":{"set": "long text field with parsed content"}
}
Instead of just saying "parsed_content":"something" you have to wrap the value in "parsed_content":{"set":"something"} to trigger adding it to the existing document.
See https://wiki.apache.org/solr/UpdateXmlMessages#Optional_attributes_for_.22field.22 for documentation on how to work with multivalued fields etc.
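For completeness, the XML form of the same atomic update (as described on that wiki page) carries the operation as an update attribute on the field:

```xml
<add>
  <doc>
    <field name="id">5</field>
    <field name="parsed_content" update="set">long text field with parsed content</field>
  </doc>
</add>
```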

mvc file upload and database insert

I'm just getting my head wrapped around MVC in .net using VS 2013.
I need a little direction in regards to uploading a file (which is easy) but also inserting data about that image into a database. Specifically I want to allow the end user to enter a description, title etc. about the file being uploaded. On the back-end I want to also add to the meta data a 'Date Created', 'Path to the file', 'Category', and the File Name and a couple other pieces of data that will help with presenting files in the views. I don't want to insert the files in the DB but just use the path in the generated HTML to point to the physical file so the end user can view or download it.
Multiple file types are being used, Audio, Video, Documents, Images.
I can get the file upload to work, but writing the controller to accept a file and end-user input, and then also add the other fields I need into the database that the user never sees, is where I'm stuck. Blending the file upload with the user fields and back-end data is confusing me on how to get all the pieces to work together.
So in short getting File Upload + User Input + non-User Input values all in the same View, Controller, and Model is what I need direction on.
You have to upload your image plus data in a multipart form.
In your view you will create a POST form that uses "multipart/form-data" encoding; you can then include inputs for the model data and your file upload control within the body of the form. When it submits, it will create a multipart request: one part will contain the binary file and another part will contain your data.
On the controller action side you will receive the data with an action akin to
public ActionResult PostFile(MyModel model, HttpPostedFileBase file) {...}
There are numerous posts on SO with more details so I won't go into that.
