Get information from Yahoo Finance - google-sheets

I want to get the Market Cap value from this site using importxml:
https://finance.yahoo.com/quote/KIT.OL?p=KIT.OL&.tsrc=fin-srch
Where it says 3.217B.
I am using this to get the "previous close" value:
=ImportXML("https://sg.finance.yahoo.com/quote/"&B3&"/history?p="&B3; "//tbody/tr[1]/td[6]")
I was hoping I could just adjust the above formula to get the market cap value. Can anyone help?
Thanks!

try:
=INDEX(IMPORTXML(A1, "//tr"), 9, 2)
or:
=INDEX(QUERY(TO_TEXT(IMPORTXML(A1, "//tr")),
"select Col2 where Col1 = 'Market Cap'", 0))
however!
these formulas can only return the old (cached) value. To get the current value you will need to use a script:
function YAHOO(url) {
  // Fetch the page HTML and pull out every <table>...</table> block with a regex.
  const res = UrlFetchApp.fetch(url, {muteHttpExceptions: true});
  const tables = [...res.getContentText().matchAll(/(<table[\w\s\S]+?<\/table>)/g)];
  if (tables.length < 2) return "No tables. Please confirm URL again.";
  // Parse each table and collect its rows, converting numeric cells to numbers.
  const values = tables.reduce((ar, [, table]) => {
    if (table) {
      const root = XmlService.parse(table).getRootElement();
      const temp = root.getChild("tbody", root.getNamespace()).getChildren().map(e => e.getChildren().map(f => isNaN(f.getValue()) ? f.getValue() : Number(f.getValue())));
      ar = ar.concat(temp);
    }
    return ar;
  }, []);
  // Transpose so that the labels form row 1 and the values form row 2.
  return values[0].map((_, i) => values.map(r => r[i]));
}
and formula:
=INDEX(YAHOO(A1), 2, 9)
extra reading: https://stackoverflow.com/a/65914858/5632629

You could try using the full XPath:
=IMPORTXML(A1,"/html/body/div[1]/div/div/div[1]/div/div[3]/div[1]/div/div[1]/div/div/div/div[2]/div[2]/table/tbody/tr[1]/td[2]/span")
Or you could try a vlookup():
=vlookup("Market Cap", IMPORTXML(A1,"//tr"),2,0)

Related

How to live-query another spreadsheet in the fastest way?

The job process is as follows:
Use column A as a condition to query the spreadsheet DATA, and return the column * of the spreadsheet DATA.
But now my spreadsheet is facing delay and lag problems, and I am not sure how to resolve it.
If anything is wrong, please forgive me.
Please check out the example sheets; any suggestion is welcome, and thanks to all.
I believe your goal is as follows.
You want to reduce the processing cost of retrieving the result.
In your situation, how about using Google Apps Script? I think that when Google Apps Script is used, the processing cost can be reduced. When Google Apps Script is used for your situation, it becomes as follows.
Sample script:
Please copy and paste the following script into the script editor of the Google Spreadsheet and save it. Then, to use it with your provided Spreadsheet, put the custom function =SAMPLE('INPUT COL B'!B2:B,'DATA'!W2:AF) into a cell. The result is returned there.
function SAMPLE(srcValues, dataValues) {
  // Build a lookup object: each value in the middle columns of DATA maps to
  // [first column (W), last column (AF)] of its row.
  const obj = dataValues.reduce((o, [w, ...v]) => {
    const last = v.pop();
    if (v.join("") != "") {
      v.forEach(c => {
        if (!o[c]) o[c] = [w, last];
      });
    }
    return o;
  }, {});
  // For each value in column B, return the matching pair or blanks.
  return srcValues.map(([b]) => obj[b] || [null, null]);
}
Testing:
When this script is used with your provided Spreadsheet, the following result is obtained.
Note:
When the data becomes larger, the custom function might no longer be usable. In that case, please run the script from the script editor, a custom menu, a button on the Spreadsheet, and so on, using the script below. Copy and paste it into the script editor of the Spreadsheet, save it, and run the function from the script editor. This script puts the result values into column "E" of the "INPUT COL B" sheet.
function myFunction() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const [srcSheet, dataSheet] = ["INPUT COL B", "DATA"].map(s => ss.getSheetByName(s));
  const srcValues = srcSheet.getRange("B2:B" + srcSheet.getLastRow()).getValues();
  const dataValues = dataSheet.getRange("W2:AF" + dataSheet.getLastRow()).getValues();
  // Same lookup logic as the custom function above.
  const obj = dataValues.reduce((o, [w, ...v]) => {
    const last = v.pop();
    if (v.join("") != "") {
      v.forEach(c => {
        if (!o[c]) o[c] = [w, last];
      });
    }
    return o;
  }, {});
  const res = srcValues.map(([b]) => obj[b] || [null, null]);
  // Write the results starting at column E of "INPUT COL B".
  srcSheet.getRange(2, 5, res.length, res[0].length).setValues(res);
}
Reference:
Custom Functions in Google Sheets

Headers not overriding when using the header option with XLSX.utils.json_to_sheet

I'm trying to change the header titles by passing an array of titles to the options, but it does not override the headers. Instead it inserts new headers before the original data. I am passing the same number of header titles.
Here is my code:
const ws: XLSX.WorkSheet = XLSX.utils.json_to_sheet(
  json,
  { header: headerColumns }
);
const wb: XLSX.WorkBook = XLSX.utils.book_new();
XLSX.utils.book_append_sheet(wb, ws, 'Transactions');
const excelBuffer: any = XLSX.write(wb, { bookType: 'xlsx', type: 'array' });
this.saveAsExcelFile(excelBuffer, excelFileName);
And the output looks like below:
The basic job of the header option is not to override the headers, but to control the column order.
That is, any value passed in the header option will be treated as the first column(s), provided the value matches an existing key in your data.
XLSX.utils.json_to_sheet([{A:1,B:2}, {B:2,C:3}], {header:['C']});
Here column "C" will be the first column in the Excel sheet.
For a more detailed description, see: https://docs.sheetjs.com/#sheetjs-js-xlsx
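As a quick way to see this column-ordering behaviour, here is a minimal Node sketch (my own illustration, not from the answer; sheet_to_csv is only used here to print the result):
const XLSX = require('xlsx');

// 'C' is listed in header, so it becomes the first column;
// the remaining keys (A, B) follow in the order they appear in the data.
const ws = XLSX.utils.json_to_sheet([{A: 1, B: 2}, {B: 2, C: 3}], {header: ['C']});
console.log(XLSX.utils.sheet_to_csv(ws));
// C,A,B
// ,1,2
// 3,,2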
This is how I have achieved similar behavior:
const XLSX = require('xlsx');

const wb = XLSX.utils.book_new();
const Heading = [
  ['Sr No', 'User Name', 'Department', 'Bank', 'Country', 'Region', 'Amount']
];

// creating the sheet and adding data from the 2nd row of column A,
// leaving the first row free for the Heading
const ws = XLSX.utils.json_to_sheet(data, { origin: 'A2', skipHeader: true });

// adding the heading to the first row of the created sheet;
// the sheet already has contents from the statement above
XLSX.utils.sheet_add_aoa(ws, Heading, { origin: 'A1' });

// appending the sheet with a name
XLSX.utils.book_append_sheet(wb, ws, 'Records');

const fileContent = XLSX.write(wb, { bookType: 'xlsx', type: 'buffer' });
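If you then need to persist the buffer in Node, a minimal follow-up (with a hypothetical file name) could be:
// write the generated workbook buffer to disk (Node.js)
require('fs').writeFileSync('Records.xlsx', fileContent);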
A very traditional approach, but it works; please see the complete code below:
const worksheet: XLSX.WorkSheet = XLSX.utils.json_to_sheet(
  this.releaseDateWiseCountList
);
// Overwrite the auto-generated header cells in row 1 directly.
worksheet.A1.v = "Pick Release Date";
worksheet.B1.v = "Task Type";
worksheet.C1.v = "First Shift";
worksheet.D1.v = "Second Shift";
worksheet.E1.v = "Total";
worksheet.F1.v = "Grand Total";
worksheet.G1.v = "Pick %";
const workbook: XLSX.WorkBook = {
  Sheets: { 'data': worksheet }, SheetNames: ['data']
};
const excelBuffer: any = XLSX.write(
  workbook, { bookType: 'xlsx', type: 'array' }
);
const data: Blob = new Blob([excelBuffer], { type: EXCEL_TYPE });
FileSaver.saveAs(data, 'Result_export_' + new Date().getTime() + EXCEL_EXTENSION);
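Note that EXCEL_TYPE and EXCEL_EXTENSION are constants assumed to be defined elsewhere in the component; a typical pair (my assumption, not shown in the original answer) would be:
// assumed constants for the Blob MIME type and file extension
const EXCEL_TYPE = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;charset=UTF-8';
const EXCEL_EXTENSION = '.xlsx';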

Neo4j 3.5.1 error: can't get just-created node by id

EDIT: I just got the same behavior on 3.4.
EDIT 2: If I remove disableLosslessIntegers from the connection the issue goes away, but then all integer numbers come back as {low: 20, high: 0} style structures, which breaks my entire application.
The following code works fine on neo4j 3.3 using the 1.7.2 neo4j-driver for node:
import {v1 as neo4j} from 'neo4j-driver';

const url: string = process.env.COREDB_URL || '';
const user: string = process.env.COREDB_USERNAME || '';
const password: string = process.env.COREDB_PASSWORD || '';
const driver = neo4j.driver(url, neo4j.auth.basic(user, password), {disableLosslessIntegers: true});
let connection = driver.session();

async function go() {
  let res = await connection.run(`create (b:Banana {tag: 'test'}) return b, id(b) as id`, {});
  let b = res.records[0].get('b').properties;
  console.log('b', b);
  let id = res.records[0].get('id');
  console.log('id', id);
  res = await connection.run(`MATCH (u) where id(u)=$id return u as id`, {id: id});
  console.log(res.records);
  let id2 = res.records[0].get('id').properties;
  console.log('id2', id2);
}

go().then(() => console.log('done')).catch((e) => console.log(e.message));
it gives the following output:
> node tools\test-id.js
b { tag: 'test' }
id 1858404
[ Record {
keys: [ 'id' ],
length: 1,
_fields: [ [Node] ],
_fieldLookup: { id: 0 } } ]
id2 { tag: 'test' }
done
Under 3.5.1 it does not work. The second statement returns no records:
> node tools\test-id.js
b { tag: 'test' }
id 1856012
[]
Cannot read property 'get' of undefined
BTW, the reason I need to do the get-by-id right after the create is that I am using an APOC trigger to add things to the node after creation, and APOC triggers apparently run after the object is created and returned, so I need the second get to see the transformed node.
But for this distilled example I removed the trigger from my DB to ensure that it was not causing the issue.
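No fix is posted above, but one hedged workaround sketch (my assumption: with disableLosslessIntegers the id comes back as a plain JS number and may be sent back as a float, so the MATCH by id finds nothing) is to keep lossless integers enabled and convert explicitly with the driver's Integer helpers, which exist in neo4j-driver 1.x:
// Sketch only: same url/user/password as above, but without disableLosslessIntegers.
const driver = neo4j.driver(url, neo4j.auth.basic(user, password));
const session = driver.session();

async function check() {
  let res = await session.run(`CREATE (b:Banana {tag: 'test'}) RETURN b, id(b) AS id`);
  const id = res.records[0].get('id');            // a lossless Integer, e.g. {low: 20, high: 0}
  console.log('id as JS number:', id.toNumber()); // convert for app code while in safe range

  // Pass the Integer (or neo4j.int(someNumber)) back as the parameter,
  // so the driver sends a true integer rather than a float.
  res = await session.run(`MATCH (u) WHERE id(u) = $id RETURN u`, {id: neo4j.int(id)});
  console.log(res.records[0].get('u').properties);
}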

How to clean/clear the cache of a ReplaySubject instance in RxJS?

I have a ReplaySubject and I would like to reset it so it has no cached values, so that after this reset the next subscriber doesn't get the history.
Example
new subject replay
subject.next(1)
reset subject <- this question
subject.subscribe() // should NOT receive 1
How can I do this? I need the subject to be the same instance.
You may look at using a combination of special values and the filter operator to get something close to what you are trying to achieve.
Let's take a simple case.
You want to replay just the last value, and null is the special value representing the reset. The code would be:
const rs = new ReplaySubject<any>(1); // replay the last value
const rsObs = rs.asObservable().pipe(filter(d => d !== null));

rs.next(1);
rs.next(2);

setTimeout(() => {
  console.log('first subscription');
  rsObs.subscribe(console.log); // logs 2 on the console
}, 10);

setTimeout(() => {
  rs.next(null); // the "reset": later subscribers only replay null, which is filtered out
}, 20);

setTimeout(() => {
  console.log('second subscription');
  rsObs.subscribe(console.log); // nothing is logged
}, 30);
The best way that comes to my mind, given your requirement that "I need the subject to be the same instance", would be to have the following observables:
// This is your input
const source$: Observable<T>;
const proxy$ = new ReplaySubject<T>(n);
const reset$ = new BehaviorSubject<number>(0);
Now it's important that we hook up the following before you emit on source$:
source$.pipe(timestamp()).subscribe(proxy$);
Then, finally, you can expose your data like this:
const data$ = proxy$.pipe(
  withLatestFrom(reset$),
  filter(([timedItem, resetTimestamp]) => timedItem.timestamp > resetTimestamp),
  map(([timedItem]) => timedItem.value),
);
You can now use reset$.next(+new Date()) to trigger the reset.
If you can make sure to provide timestamped values to source$, you can skip the proxy$.
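For reference, here is a minimal self-contained sketch of the wiring above (my own assembly, assuming RxJS 6+ and a replay depth of 1):
const { Subject, ReplaySubject, BehaviorSubject } = require('rxjs');
const { timestamp, withLatestFrom, filter, map } = require('rxjs/operators');

const source$ = new Subject();        // your input
const proxy$ = new ReplaySubject(1);  // replays the last timestamped item
const reset$ = new BehaviorSubject(0);

// Hook up the proxy before emitting on source$.
source$.pipe(timestamp()).subscribe(proxy$);

const data$ = proxy$.pipe(
  withLatestFrom(reset$),
  filter(([timedItem, resetTimestamp]) => timedItem.timestamp > resetTimestamp),
  map(([timedItem]) => timedItem.value),
);

source$.next(1);
data$.subscribe(v => console.log('before reset:', v)); // logs 1

reset$.next(Date.now()); // the "reset": older items now fail the timestamp filter
data$.subscribe(v => console.log('after reset:', v));  // logs nothing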

AutoComplete does not kick in when first character is 0

Does anyone know why my AutoComplete is not working when the first character I type is 0 (zero)? For debugging purposes I set up my AC to just tell me whether a row was found or not, and it tells me for any character I type as the first character, except 0. I have to type a second character after the 0 for it to kick in and start working. It's as if the minLength attribute were 2 when the first char is 0. Has anyone run into or heard of this, and do you know how to fix it? Here is my code:
//AutoComplete code in question
$(function() {
  var itemcode_ac = {
    source: "/webservices/whs_bincodeAC.php",
    select: function(event, ui) {
      $('#txtBin').val(ui.item.value);
      getWhsInfo();
    },
    minLength: 1
  };
  $('#txtBin').autocomplete(itemcode_ac);
});
whs_bincodeAC.php:
<?php
if(isset($_GET["term"]) && !empty($_GET["term"])) {
    include_once $_SERVER['DOCUMENT_ROOT'].'/path/to/dbConnect.php';
    $term = mysql_real_escape_string(trim($_GET["term"]));
    //wildcard appended here for parameterized query (MySqli)
    $term .= "%";
    $query = "SELECT DISTINCT BinCode, ItemCode, ItemName, WhsCode, DataAsOfDate FROM whse_tbl
              WHERE BinCode LIKE '$term' or ItemCode LIKE '$term' ORDER BY BinCode LIMIT 0, 10";
    $res = mysql_query($query);

    //This is the debug code I described above
    /*if($row = mysql_fetch_assoc($res))
        echo json_encode(array(array('value' => "is row")));
    else
        echo json_encode(array(array('value' => "no row")));
    return;*/

    $matches = array();
    while($row = mysql_fetch_assoc($res))
    {
        $matches[] = array('value' => $row["BinCode"], 'label' => $row["BinCode"].' - '.$row["ItemCode"],
                           'name' => $row["ItemName"], 'whscode' => $row["WhsCode"], 'asOfDate' => $row["DataAsOfDate"]);
    }
    echo json_encode($matches);
}
?>
Note: My boss is having me use MySql and not MySqli extension for now.
The problem is the emptiness check on the term. In PHP, the string "0" is treated as empty/falsy, so both empty($_GET["term"]) and !$_GET["term"] evaluate to true when the user types a single 0, and your if block is skipped.
So instead of isset($_GET["term"]) && !empty($_GET["term"]), use a check such as isset($_GET["term"]) && $_GET["term"] !== ''.
