Deno: Processing tar archive results in checksum error (Standard Library)

I would like to process a tar archive with the help of tar.ts from the Standard Library.
The archive can be written successfully to test.tar by the following code:
import { Tar, Untar } from "https://deno.land/std/archive/tar.ts";
// create tar archive
const tar = new Tar();
const content = new TextEncoder().encode("hello tar world!");
await tar.append("output.txt", {
  reader: new Deno.Buffer(content),
  contentSize: content.byteLength,
});
await Deno.writeFile("./test.tar", tar.out);
However, reading the tar triggers an error:
error: Uncaught Error: checksum error
      throw new Error("checksum error");
            ^
at Untar.extract (https://deno.land/std/archive/tar.ts:432:13)
at async file:///C:/Users/bela/Desktop/script/test.ts:23:16
The code:
// read from tar archive
const untar = new Untar(await Deno.open("./test.tar"));
const buf = new Deno.Buffer();
const result = await untar.extract(buf); // <-- this line triggers error
const untarText = new TextDecoder("utf-8").decode(buf.bytes());
What step did I miss?

You have to use tar.getReader() to get the correct tar content.
const tar = new Tar();
const content = new TextEncoder().encode("hello tar world!");
await tar.append("output.txt", {
  reader: new Deno.Buffer(content),
  contentSize: content.byteLength,
});

const writer = await Deno.open("./test.tar", { write: true, create: true });
await Deno.copy(tar.getReader(), writer);
writer.close(); // close the handle so the file can be reopened for reading

const untar = new Untar(await Deno.open("./test.tar", { read: true }));
const buf = new Deno.Buffer();
const result = await untar.extract(buf); // no checksum error any more
const untarText = new TextDecoder("utf-8").decode(buf.bytes());
console.log(untarText); // hello tar world!
tar.out is currently a zero-filled Uint8Array, which appears to be a bug in the std code.
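For reference, std's Untar also implements the async iterator protocol, which avoids calling extract() by hand. A minimal sketch, assuming a std version where each entry exposes the tar header fields and is itself a reader:

import { Untar } from "https://deno.land/std/archive/tar.ts";

const reader = await Deno.open("./test.tar", { read: true });
const untar = new Untar(reader);
for await (const entry of untar) {
  // entry carries the tar header (fileName, type, ...) and can be read like a file
  const buf = new Deno.Buffer();
  await Deno.copy(entry, buf);
  console.log(entry.fileName, new TextDecoder().decode(buf.bytes()));
}
reader.close();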

TypeError: provider.send is not a function in anchor

I'm following https://book.solmeet.dev/notes/intro-to-anchor and wrote the code according to the textbook, but I get TypeError: provider.send is not a function when calling await provider.send():
import * as anchor from '@project-serum/anchor';
import { Program } from '@project-serum/anchor';
import { AnchorEscrow } from '../target/types/anchor_escrow';
import { PublicKey, SystemProgram, Transaction } from '@solana/web3.js';
import { TOKEN_PROGRAM_ID, Token } from "@solana/spl-token";
import { assert } from "chai";
describe('anchor-escrow', () => {
  // Configure the client to use the local cluster.
  const provider = new anchor.getProvider();
  anchor.setProvider(provider);

  const program = anchor.workspace.AnchorEscrow as Program<AnchorEscrow>;

  let mintA = null;
  let mintB = null;
  let initializerTokenAccountA = null;
  let initializerTokenAccountB = null;
  let takerTokenAccountA = null;
  let takerTokenAccountB = null;
  let vault_account_pda = null;
  let vault_account_bump = null;
  let vault_authority_pda = null;

  const takerAmount = 1000;
  const initializerAmount = 500;

  const escrowAccount = anchor.web3.Keypair.generate();
  const payer = anchor.web3.Keypair.generate();
  const mintAuthority = anchor.web3.Keypair.generate();
  const initializerMainAccount = anchor.web3.Keypair.generate();
  const takerMainAccount = anchor.web3.Keypair.generate();

  it("Initialize program state", async () => {
    // Airdropping tokens to a payer.
    await provider.connection.confirmTransaction(
      await provider.connection.requestAirdrop(payer.publicKey, 10000000000),
      "confirmed"
    );

    // ⚠️ An error has occurred here
    await provider.send(
      (() => {
        const tx = new Transaction();
        tx.add(
          SystemProgram.transfer({
            fromPubkey: payer.publicKey,
            toPubkey: initializerMainAccount.publicKey,
            lamports: 100000000,
          }),
          SystemProgram.transfer({
            fromPubkey: payer.publicKey,
            toPubkey: takerMainAccount.publicKey,
            lamports: 100000000,
          })
        );
        return tx;
      })(),
      [payer]
    );
  });
Then I run anchor test:
Warning: cargo-build-bpf is deprecated. Please, use cargo-build-sbf
cargo-build-bpf child: /Users/yjy/.local/share/solana/install/active_release/bin/cargo-build-sbf --arch bpf
Error: Function _ZN13anchor_escrow9__private8__global8exchange17hb34dfff05db149f5E Stack offset of 4096 exceeded max offset of 4096 by 0 bytes, please minimize large stack variables
Finished release [optimized] target(s) in 0.34s
Found a 'test' script in the Anchor.toml. Running it as a test suite!
Running test suite: "/Users/yjy/Documents/Code/solana/anchor-projects/anchor-escrow/Anchor.toml"
yarn run v1.22.18
warning package.json: No license field
$ /Users/yjy/Documents/Code/solana/anchor-projects/anchor-escrow/node_modules/.bin/ts-mocha -p ./tsconfig.json -t 1000000 'tests/**/*.ts'
anchor-escrow
1) Initialize program state
✔ Initialize escrow
✔ Exchange escrow state
✔ Initialize escrow and cancel escrow
3 passing (598ms)
1 failing
1) anchor-escrow
Initialize program state:
TypeError: provider.send is not a function
at /Users/yjy/Documents/Code/solana/anchor-projects/anchor-escrow/tests/anchor-escrow.ts:45:20
at Generator.next (<anonymous>)
at fulfilled (tests/anchor-escrow.ts:28:58)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
How can I solve this problem?
It looks like send doesn't exist, but you can use sendAndConfirm if you want to confirm the transaction too, or sendAll if you just want to send. sendAll takes an array of transactions, so you can do:
const tx = new Transaction();
tx.add(
  SystemProgram.transfer({
    fromPubkey: payer.publicKey,
    toPubkey: initializerMainAccount.publicKey,
    lamports: 100000000,
  }),
  SystemProgram.transfer({
    fromPubkey: payer.publicKey,
    toPubkey: takerMainAccount.publicKey,
    lamports: 100000000,
  })
);
await provider.sendAll([{ tx, signers: [payer] }]);
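If you also want the transaction confirmed, a minimal sketch with sendAndConfirm, assuming an Anchor version (roughly 0.24+) whose AnchorProvider exposes it:

// reusing the tx built above; signature assumed: sendAndConfirm(tx, signers)
const sig = await provider.sendAndConfirm(tx, [payer]);
console.log("transfer confirmed:", sig);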

TauriJS writeBinaryFile cannot freeze array buffer views with elements

I'm working with TauriJS, trying to modify a zip file with jszip and then save it with writeBinaryFile.
async function saveFile(org_path, new_path, pack_format) {
  var zip = new JSZip();
  // get file
  var org_file = await window.__TAURI__.fs.readBinaryFile(org_path);
  await zip.loadAsync(org_file);
  // edit file
  var pack_json = await zip.file("pack.json").async("string");
  pack_json = JSON.parse(pack_json);
  pack_json.pack.pack_format = pack_format;
  zip.file("pack.json", JSON.stringify(pack_json));
  // save file
  var array_zip = await zip.generateAsync({ type: "uint8array" });
  await window.__TAURI__.fs.writeBinaryFile(new_path, array_zip);
}
This is the code I currently have. The problem is that it throws Uncaught TypeError: Cannot freeze array buffer views with elements.
I wasn't able to find a solution to this error. Is it somehow possible to bring the zip file into the right format so it can be saved?
I found a way to fix the problem on this page:
https://qdmana.com/2022/144/202205241127535226.html
This is my adjusted code:
async function saveFile(org_path, new_path, pack_format) {
  var zip = new JSZip();
  // get file
  var org_file = await window.__TAURI__.fs.readBinaryFile(org_path);
  await zip.loadAsync(org_file);
  // edit file
  var pack_json = await zip.file("pack.json").async("string");
  pack_json = JSON.parse(pack_json);
  pack_json.pack.pack_format = pack_format;
  zip.file("pack.json", JSON.stringify(pack_json));
  // save file
  zip.generateAsync({ type: 'blob' }).then((content) => {
    var file = new FileReader();
    file.readAsArrayBuffer(content);
    file.onload = function (e) {
      var fileU8A = new Uint8Array(e.target.result);
      window.__TAURI__.fs.writeBinaryFile({ contents: fileU8A, path: new_path + ".zip" });
    };
  });
}
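For what it's worth, the freeze error appears to happen when the typed array handed to Tauri is a view over a larger buffer. A minimal sketch of a shorter alternative that copies the bytes into a standalone Uint8Array first; whether this is enough depends on your Tauri version, so treat it as an assumption:

// save file (alternative): copy into a fresh Uint8Array before writing
var array_zip = await zip.generateAsync({ type: "uint8array" });
var standalone = new Uint8Array(array_zip); // copies the bytes, so no shared view remains
await window.__TAURI__.fs.writeBinaryFile(new_path + ".zip", standalone);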

Can't send readable stream with Pinata SDK to IPFS - 400 bad request and Unhandled rejection

Due to the nature of my project I have an image dataURL (NOT an actual image file) that I am trying to upload to IPFS via the Pinata SDK. I have converted the image dataURL into a buffer (array) and tried two different methods, but neither of them works. Here is my code:
SAMPLE 1
var myBlob = new Blob([new Uint8Array(myBuffer)]);
var myReadableStream = myBlob.stream()
pinata.pinFileToIPFS(myReadableStream)
ERROR: Unhandled Rejection (TypeError): source.on is not a function
SAMPLE 2
var myBlob = new Blob([new Uint8Array(myBuffer)]);
var myHeaders = new Headers();
myHeaders.append("pinata_api_key", "MY_KEY");
myHeaders.append("pinata_secret_api_key", "MY_SECRET_KEY");

var formdata = new FormData();
formdata.append("test", myBlob);

var requestOptions = {
  method: 'POST',
  headers: myHeaders,
  body: formdata,
  redirect: 'follow'
};

fetch("https://api.pinata.cloud/pinning/pinFileToIPFS", requestOptions)
  .then(response => response.text())
  .then(result => console.log('result', result))
  .catch(error => console.log('error', error));
ERROR: 400 Bad Request, result {"error":"Unexpected field"}
With buffers, things can be a little tricky; you'll need to format your request in a slightly different way.
I would take a look at this code snippet for an example of how somebody got this to work:
const pinataSDK = require("@pinata/sdk");
const pinata = pinataSDK(
  "Pinata API Key",
  "Pinata API Secret"
);
const { fs, vol } = require("memfs");

(async () => {
  try {
    const base64 = "base64 file string";
    const buf = Buffer.from(base64, "base64");
    fs.writeFileSync("File Name", buf);
    const read = vol.createReadStream("File Name");
    const res = await pinata.pinFileToIPFS(read);
    console.log(res);
  } catch (error) {
    console.log(error);
  }
})();
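As a side note on SAMPLE 2: the 400 response {"error":"Unexpected field"} most likely comes from the form field name; the pinFileToIPFS endpoint expects the file part to be named file. A minimal sketch (the "image.png" filename is made up):

var formdata = new FormData();
formdata.append("file", myBlob, "image.png"); // the field name must be "file", not "test"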

How can I get a file checksum in Deno?

Just starting with Deno, I am trying to figure out how to calculate a binary file checksum. It seems to me that the problem is not with the methods provided by the hash module of the standard library, but with the file streaming method and/or the type of the chunks feeding the hash.update method.
I have been trying a few alternatives, related to file opening and chunk types, with no success. A simple example follows:
import { createHash } from "https://deno.land/std@0.80.0/hash/mod.ts";

const file = new File(["my_big_folder.tar.gz"], "./my_big_folder.tar.gz");
const iterator = file.stream().getIterator();
const hash = createHash("md5");
for await (let chunk of iterator) {
  hash.update(chunk);
}
console.log(hash.toString()); // b35edd0be7acc21cae8490a17c545928
This code compiles and runs with no errors; unfortunately the result differs from what I get from Node's crypto module and from md5sum in the Linux coreutils. Any suggestions?
Node.js code:
const crypto = require('crypto');
const fs = require('fs');

const hash = crypto.createHash('md5');
const file = './my_big_folder.tar.gz';
const stream = fs.ReadStream(file);
stream.on('data', data => { hash.update(data); });
stream.on('end', () => {
  console.log(hash.digest('hex')); // c18f5eac67656328f7c4ec5d0ef5b96f
});
The same result in bash:
$ md5sum ./my_big_folder.tar.gz
$ c18f5eac67656328f7c4ec5d0ef5b96f ./my_big_folder.tar.gz
On Windows 10 this can be used:
CertUtil -hashfile ./my_big_folder.tar.gz md5
The File API isn't used to read a file from disk in Deno; the File constructor takes the array you pass as the file's contents, so the code above hashes the literal string "my_big_folder.tar.gz" rather than the archive. To read the actual file you need the Deno.open API, then turn the file into an iterable like this:
import { createHash } from "https://deno.land/std@0.80.0/hash/mod.ts";

const hash = createHash("md5");
const file = await Deno.open(new URL(
  "./BigFile.tar.gz",
  import.meta.url, // needed because JavaScript paths are relative to the main script, not the current file
));
for await (const chunk of Deno.iter(file)) {
  hash.update(chunk);
}
console.log(hash.toString());
Deno.close(file.rid);
Another option, on newer std versions, is to read the file into a buffer and digest it with the std crypto module:

import { crypto, toHashString } from 'https://deno.land/std@0.176.0/crypto/mod.ts';

const getFileBuffer = (filePath: string) => {
  const file = Deno.openSync(filePath);
  const buf = new Uint8Array(file.statSync().size);
  file.readSync(buf); // note: a single readSync call is not guaranteed to fill the whole buffer
  file.close();
  return buf;
};

const getMd5OfBuffer = (data: BufferSource) => toHashString(crypto.subtle.digestSync('MD5', data));

export const getFileMd5 = (filePath: string) => getMd5OfBuffer(getFileBuffer(filePath));
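For big files it may be preferable not to load everything into memory. A minimal sketch, assuming std@0.176.0, where crypto.subtle.digest also accepts an async iterable of chunks such as the file's readable stream:

import { crypto, toHashString } from 'https://deno.land/std@0.176.0/crypto/mod.ts';

const file = await Deno.open('./my_big_folder.tar.gz', { read: true });
// digest() consumes the stream chunk by chunk; the stream closes the file when it ends
const digest = await crypto.subtle.digest('MD5', file.readable);
console.log(toHashString(digest));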

How to upload file in angular 2

This is the function I am using to upload a file, but it is giving me the error: Length is undefined. What do I have to change in this code, and where do I give the path of the file to upload?
fileChange(event) {
  let fileList: FileList = event.target.files;
  if (fileList) {
    let file: File = fileList[0];
    let formData: FormData = new FormData();
    formData.append('uploadFile', file, file.name);
    let headers = new Headers();
    /** No need to include Content-Type in Angular 4 */
    headers.append('Content-Type', 'multipart/form-data');
    headers.append('Accept', 'application/json');
    let options = new RequestOptions({ headers: headers });
    this.http.post(`assets/Files/info.txt`, formData, options)
      .map(res => res.json())
      .catch(error => Observable.throw(error))
      .subscribe(
        data => console.log(fileList),
        error => console.log(error)
      );
  }
}
You need to use an XHR request to transfer files:
fileChange(event: EventTarget) {
  let eventObj: MSInputMethodContext = <MSInputMethodContext> event;
  let target: HTMLInputElement = <HTMLInputElement> eventObj.target;
  let files: FileList = target.files;
  if (files) {
    let file: File = files[0];
    this.upload(file);
  }
}

public upload(filedata: File) {
  let url = 'your url';
  if (typeof filedata != 'undefined') {
    return new Promise((resolve, reject) => {
      let formData: any = new FormData();
      let xhr = new XMLHttpRequest();
      formData.append('icondata', filedata, filedata.name);
      xhr.open('POST', url, true);
      xhr.setRequestHeader('Authorization', 'JWT ' + localStorage.getItem('id_token'));
      xhr.send(formData);
      xhr.onreadystatechange = function () {
        if (xhr.readyState == XMLHttpRequest.DONE) {
          resolve(JSON.parse(xhr.responseText));
        }
      };
    });
  }
}
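A hypothetical call site for the upload() helper above (the log label is made up; note that upload() returns undefined when no file was passed):

const result = this.upload(file);
if (result) {
  result.then((response) => console.log('upload response:', response));
}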
I understand that this is not the functionality you want, but with no backend you cannot upload files persistently; they have to be stored somewhere. If you just want to manipulate file names, for instance, skip the Express part of my answer. I personally used this code, which I altered to upload multiple files.
In your component:
import { FormArray, FormBuilder, FormControl, FormGroup } from "@angular/forms";
Declare FormBuilder in the constructor:
constructor (private http: Http, private fb: FormBuilder) {}
In ngOnInit(), set a variable as follows:
this.myForm = this.fb.group({chosenfiles: this.fb.array([])});
This is the code for the upload method:
// invoke the upload to server method
// TODO
// Should be in a service (injectable)
upload() {
  const formData: any = new FormData();
  const files: Array<File> = this.filesToUpload;
  //console.log(files);
  const chosenf = <FormArray> this.myForm.controls["chosenfiles"];
  // iterate over the number of files
  for (let i = 0; i < files.length; i++) {
    formData.append("uploads[]", files[i], files[i]['name']);
    // store file name in an array
    chosenf.push(new FormControl(files[i]['name']));
  }
  this.http.post('http://localhost:3003/api/upload', formData)
    .map(files => files.json())
    .subscribe(files => console.log('upload completed, files are : ', files));
}
The method responsible for the file change:
fileChangeEvent(fileInput: any) {
  this.filesToUpload = <Array<File>> fileInput.target.files;
  const formData: any = new FormData();
  const files: Array<File> = this.filesToUpload;
  console.log(files);
  const chosenf = <FormArray> this.myForm.controls["chosenfiles"];
  // iterate over the number of files
  for (let i = 0; i < files.length; i++) {
    formData.append("uploads[]", files[i], files[i]['name']);
    // store file name in an array
    chosenf.push(new FormControl(files[i]['name']));
  }
}
The template is something like this:
<input id="cin" name="cin" type="file" (change)="fileChangeEvent($event)" placeholder="Upload ..." multiple/>
Notice multiple, which is responsible for allowing multiple selections.
The Express API which will handle the request uses multer (after an npm install):
var multer = require('multer');
var path = require('path');
Specify a static directory which will hold the files:
// specify the folder
app.use(express.static(path.join(__dirname, 'uploads')));
As specified by multer (PS: I did not investigate multer; as soon as I got it working I moved on to another task, so feel free to remove unnecessary code):
var storage = multer.diskStorage({
  // destination
  destination: function (req, file, cb) {
    cb(null, './uploads/');
  },
  filename: function (req, file, cb) {
    cb(null, file.originalname);
  }
});
var upload = multer({ storage: storage });
And finally the endpoint:
app.post("/api/upload", upload.array("uploads[]", 12), function (req, res) {
  console.log('files', req.files);
  res.send(req.files);
});
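For completeness, a minimal sketch of the Express setup the snippets above assume; the port matches the URL used in the component, everything else here is an assumption:

var express = require('express');
var app = express();
// ... static directory, multer storage and the /api/upload route from above go here ...
app.listen(3003, function () {
  console.log('upload API listening on port 3003');
});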
