I want to use the alasql library in Script Lab for SQL queries.
I have added the browser (CDN) link for alasql under the Libraries tab.
I add a lot of links to a Google Docs document every day. Then I click "Replace URL with its title" for each link, which is very tedious.
Is it possible to automate this process using a script or something else? Please help me.
Yes it is possible!
What you can do in this situation is gather all the links you have into an array and then use string manipulation (a regular expression, for example) to extract the title for each link.
For this task you can benefit from Apps Script - a development platform that can be used to build web apps and automate tasks. What makes it special is that it is easy to use and that it integrates well with G Suite.
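For illustration, here is a minimal Apps Script sketch of that idea. The function name and the title-extraction regex are my own, and it only handles the simple case where a paragraph's visible text is exactly the raw URL:

function replaceLinkUrlsWithTitles() {
  var body = DocumentApp.getActiveDocument().getBody();
  var found = body.findElement(DocumentApp.ElementType.TEXT);
  while (found !== null) {
    var text = found.getElement().asText();
    var content = text.getText();
    var url = content ? text.getLinkUrl(0) : null; // link attached at the first character, if any
    if (url && content === url) { // only touch paragraphs whose visible text is the raw URL
      var html = UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText();
      var match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
      if (match && match[1].trim().length > 0) {
        var title = match[1].trim();
        text.setText(title);                        // show the page title instead of the URL
        text.setLinkUrl(0, title.length - 1, url);  // keep the hyperlink on the new text
      }
    }
    found = body.findElement(DocumentApp.ElementType.TEXT, found);
  }
}

You could run something like this manually from the script editor, or wire it to a custom menu or a time-driven trigger so it runs without any clicks.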
Reference
Google Apps Script;
Text Class Apps Script;
Document Class Apps Script;
Document Class Apps Script - replaceText(searchPattern, replacement).
I want to fetch the table details from https://www.edelweiss.in/market/nse-option-chain into a Google Sheet. I used the formula
=IMPORTHTML("https://www.edelweiss.in/market/nse-option-chain","table",1), but I am getting an empty result / error.
Unfortunately, =IMPORTHTML cannot be used here because the website you're trying to get the data from loads its data dynamically with JavaScript, so the table is not present in the static HTML that the formula retrieves.
Possible solution
A possible solution to gather the data might be found if you follow these steps:
1. Identify the URL of the request that returns the data you want
You can do this by opening the Chrome developer tools, switching to the Network tab and filtering the requests by XHR.
2. Import the ImportJSON library into Apps Script
Install the library (see the reference below) by going to Tools -> Script editor and adding the ImportJSON.gs file to your spreadsheet's script project. A stripped-down sketch of what such a custom function does is shown after step 3.
3. Use the formula in your Sheet
You could use the formula like this:
=ImportJSON("THE_LINK_FOUND_AT_STEP_1")
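For context only, this is roughly what such a custom function does internally. The function below is a simplified illustration with a made-up name (IMPORTJSON_SIMPLE), not the actual ImportJSON library code, and it only flattens the top-level fields of the JSON response:

function IMPORTJSON_SIMPLE(url) {
  // Fetch the JSON endpoint found in step 1 and parse it.
  var response = UrlFetchApp.fetch(url, {muteHttpExceptions: true});
  var data = JSON.parse(response.getContentText());
  // Return a 2-D array of (key, value) pairs so Sheets renders it as a range.
  var rows = [];
  for (var key in data) {
    rows.push([key, JSON.stringify(data[key])]);
  }
  return rows;
}

It would be used from a cell the same way, e.g. =IMPORTJSON_SIMPLE("THE_LINK_FOUND_AT_STEP_1"); for anything non-trivial, the real library's nested-path queries are what you want.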
Reference
ImportJSON Library
I would like to read PDF files into the pipeline. However, I haven't found any Apache Beam example covering file formats other than plain text or XML.
There is no pre-existing PDF reader available in Dataflow or Apache Beam libraries. However, you could use the example of this reader for TensorFlow records as a model to write your own using the PDF parsing library of your choice.
https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/io/TFRecordIO.java
The neo4j example data does not work in the most recent version 2.1.2 of neo4j as documented here and here. They are built for 1.9 and apparently can only be upgraded to 2.0, not 2.1. Is there a way to extract the raw data as a csv or cypher file (with a bunch of CREATE statements) from either the tarballs (which contain a graph.db, which I do not understand) or the associated github repositories (which I also do not understand)?
It's up to us to update the datasets; thanks for pointing it out. I think we'll also put them into versioned subdirectories, so it is easier to see which version they are built for.
In general, for small enough datasets you can use the neo4j-shell and the "dump" command to generate Cypher statements.
Also, in the Neo4j browser you can download query results as CSV.
I also wrote a set of helper tools for the neo4j-shell that allow you to export and import data as CSV, GraphML and other formats.
As of now my workflow is something like this: I check out an SVN or CVS repository and then compile the document locally on my machine to get either a PS or a PDF file. But I was wondering if there is a web front-end to do all of this - for instance, an editor with which you can edit the file online and then just download the PDF produced by compiling it?
Any suggestions?
http://www.scribtex.com/pages/index
http://code.google.com/p/latex-lab/
latex-lab will build on top of the Google Apps editor base...
ScribTeX looks like it is available only as a hosted service.
Another to add to the list is TeXonWeb.
If you mean online LaTeX compilers, then there are two I know of - at baywifi.com (to PDF) and at ScienceSoft (to several formats). I haven't seen any full editors, though.
There is a CMS based on LaTeX out there at www.osreviews.net.
The best site I found to produce PDF from LaTeX online is PC Shows.
Verbosus offers an online LaTeX editor that supports PDF preview, HTTPS, syntax highlighting, code completion, templates, etc. (It also offers an editor for Octave/MATLAB.)
This is less a web-based interface than a simple drag-and-drop CGI script that converts LaTeX syntax to a graphic... www.forkosh.com/mimetex.html
latex-online is a simple open-source web service that compiles LaTeX sources / public git repositories and returns PDFs. It has both a simple web front-end and a command-line tool for interacting with the service - you might find it interesting.
One rather new possibility is https://texlive.net/
You can either edit your documents interactively or pass your document to it via the URL. For example, a simple hello-world document can be constructed as
https://texlive.net/run?%5Cdocumentclass%7Barticle%7D%0A%5Cusepackage%7Bamsmath%7D%0A%5Cbegin%7Bdocument%7D%0AHello%20world!%0A%5Cend%7Bdocument%7D
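Decoded, the URL above corresponds to the following LaTeX document (the percent-encoded characters are just the backslashes, braces, newlines and spaces):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
Hello world!
\end{document}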