File processing from a local drive for contiki-os

Please have a look at the following code. I am writing an application that, among several other processes, needs to process a file from the local host on a Sky mote. Contiki's cfs_open cannot open a file from the local drive. The goal is to open a file on the local drive and store it in the Sky's flash for live streaming; the streaming part is already working. Any suggestions on how to upload the file using CFS?
#include "contiki.h"
#include "cfs/cfs.h"
#include <stdio.h>

PROCESS(coffee_file_process, "Coffee file process");
AUTOSTART_PROCESSES(&coffee_file_process);

PROCESS_THREAD(coffee_file_process, ev, data)
{
  PROCESS_BEGIN();

  char buf[100] = "Hello Coffee";
  int fd;

  /* CFS_WRITE is needed in addition to CFS_READ, otherwise cfs_write() below fails. */
  fd = cfs_open("cate.txt", CFS_READ | CFS_WRITE);
  if(fd >= 0) {
    cfs_write(fd, buf, sizeof(buf));
    cfs_seek(fd, 0, CFS_SEEK_SET);   /* rewind to the beginning of the file */
    cfs_read(fd, buf, sizeof(buf));
    printf("Read message: %s\n", buf);
    cfs_close(fd);
  }

  PROCESS_END();
}

If you are simulating with Cooja, you can script it. On a real node there is no way around the serial port: either use the shell, as suggested by Sarwarul, or write your own serial-to-CFS dumper.
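For the dumper approach, here is a minimal sketch of what such a process could look like, assuming the default serial-line driver is active on the Sky and that the host sends the file line by line, terminated by a hypothetical "EOF" marker; the file name "cate.txt" is reused from the question:

#include "contiki.h"
#include "cfs/cfs.h"
#include "dev/serial-line.h"
#include <stdio.h>
#include <string.h>

PROCESS(serial_to_cfs_process, "Serial to CFS dumper");
AUTOSTART_PROCESSES(&serial_to_cfs_process);

PROCESS_THREAD(serial_to_cfs_process, ev, data)
{
  static int fd;   /* must be static: locals do not survive PROCESS_WAIT_* */

  PROCESS_BEGIN();

  /* Create or overwrite the target file in the Coffee file system. */
  fd = cfs_open("cate.txt", CFS_WRITE);

  while(1) {
    /* The serial-line driver posts one event per received line,
       with the null-terminated line as the event data. */
    PROCESS_WAIT_EVENT_UNTIL(ev == serial_line_event_message && data != NULL);

    if(strcmp((char *)data, "EOF") == 0) {   /* hypothetical end-of-file marker */
      break;
    }
    if(fd >= 0) {
      cfs_write(fd, data, strlen((char *)data));
      cfs_write(fd, "\n", 1);   /* serial-line strips the newline */
    }
  }

  cfs_close(fd);
  printf("File stored in CFS\n");

  PROCESS_END();
}

On the host side you would then push the file through the mote's serial port (for example with the serialdump tool shipped with Contiki, or through Cooja's serial socket when simulating), and your streaming process can read it back with cfs_open()/cfs_read() as in the snippet above.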

Related

Reading characters within a file (the fseek command) not working in NXC (Not eXactly C) programming

Basically, I'm trying to write code that reads a value at a specified position in a file. For example, if the file contains 12345 and the code reads the 3rd position, the output should be 3.
I already know how to write numbers to a file, but I am stuck on how to read a single character because the fseek command won't work (the compiler doesn't know what the command is). I downloaded the include folder with all the extra commands, but it still couldn't find the file "stdio.h" referenced in the code.
I don't completely know how it all works anyway.
I am a complete noob at programming, so I only know the very basic stuff. I am 90% sure it is my fault that it isn't working.
#include "cstdio.h." //gets error (doesn't recognize command/file)
task main ()
{
byte handle;
int fsize = 100;
int in;
int sus = 12345;
DeleteFile("int.txt");
CreateFile("int.txt",100,handle);
Write(handle, sus);
CloseFile(handle);
OpenFileRead("int.txt",fsize,handle);
fseek(handle, 10, 3); //gets error (doesn't recognize command)
in = fgetc(handle);
ClearScreen();
NumOut(30,LCD_LINE5,in);
Wait(100000);
CloseFile(handle);
}

FileIsExist() not finding file

I am having an Expert Advisor (EA) look for the file "File.txt".
The file was created by a python program.
I can see the file in the file explorer.
The path to the file is
C:\Users\AppData\Roaming\MetaQuotes\Terminal\Common\Files.
The error code for the FileIsExist() function is 5020 (ERR_FILE_NOT_EXIST).
Why does it not recognize the file? Is it looking in another directory?
while(!FileIsExist("File.txt", 0)) {
   if(FileIsExist("File.txt", 0))
      printf("in while loop, waiting for file");
   else {
      int iErr = GetLastError();
      printf(iErr);
   }
}
If your file is in the 'Common' folder, use the corresponding flag:
bool exist=FileIsExist(filename,FILE_COMMON);
What is the idea of your code? If the file does not exist, sleep for a while (e.g. 10 ms) and then check again.

Ionic/Cordova File plugin fails to write a file when downloading a large number of documents in succession on an iOS device

The problem as the user sees it:
User has long list of documents they need to download to iOS device (200+)
User starts the download, with each file downloaded in succession.
At the end of the download queue, they discover that one of the files has failed (and it's always one of two specific files that are 25MB+)
They retry the job (which only downloads the failed document) and it succeeds
What I'm seeing as a developer:
My app pulls down the document as a blob
When I inspect the blob (within my Typescript app code), it has a size > 0
I call this.file.writeFile(directoryPath, fileName, blob, {replace: true}), which calls the Ionic File wrapper around Cordova File Plugin
However, when I look at the blob inside the write method of FileWriter.js, it has a size of zero
This all results in the error spitting out as:
{"type":"error","bubbles":false,"cancelBubble":false,"cancelable":false,"lengthComputable":false,"loaded":0,"total":0,"target":{"fileName":"","length":0,"localURL":"cdvfile://localhost/persistent/downloaded-assets/9ce34f8a-6201-4023-9f5b-de6133bd5699/{{redacted}}","position":0,"readyState":2,"result":null,"error":{},"onwritestart":null,"onprogress":null,"onwriteend":null,"onabort":null}}
What I'm getting from this is that somewhere between calling file.writeFile on the Ionic File wrapper in my typescript code and the FileWriter.write method in the Cordova package, my blob is getting corrupted, lost, or emptied somehow.
It's difficult to debug, as the layer in between these two points is minified in the Xcode debugger, so it would also be nice to hear some suggestions on how I might debug this better myself.
Do we have any idea what might be going on here? Is it a memory issue on iOS? Does Cordova timeout over multiple repeated requests?
A few things to note:
The full list of files downloads fine every time I try within the iOS Xcode simulator. This leads me to believe it might be a memory issue, but I'm not sure.
The failure always happens after about 200 files, and on one of two files that are 25-30MB+
As far as debugging goes, the earliest I can see my blob reduced to 0 is here https://github.com/apache/cordova-plugin-file/blob/4a92bbbea755aa9e5bf8cfd160fb9da1cd3287cd/www/FileWriter.js#L107 (though I might be debugging incorrectly)
EDIT - After a little more digging, I was able to see exactly where the Ionic plugin went wrong:
The code I used:
private writeFileInChunks(writer: FileWriter, file: Blob) {
  console.log('SIZE OF FILE AT START', file.size);
  const BLOCK_SIZE = 1024 * 1024;
  let writtenSize = 0;

  function writeNextChunk() {
    const size = Math.min(BLOCK_SIZE, file.size - writtenSize);
    console.log('CALCULATED SIZE:', size);
    const chunk = file.slice(writtenSize, writtenSize + size);
    console.log('SIZE OF CHUNK TO WRITE', chunk.size);
    writtenSize += size;
    writer.write(chunk);
  }

  return getPromise<any>((resolve, reject) => {
    writer.onerror = reject as (event: ProgressEvent) => void;
    writer.onwrite = () => {
      if (writtenSize < file.size) {
        writeNextChunk();
      } else {
        resolve();
      }
    };
    writeNextChunk();
  });
}
The output for the failed document:
SIZE OF FILE AT START: 34012899
CALCULATED SIZE: 1048576
SIZE OF CHUNK TO WRITE: 0
On retry:
SIZE OF FILE AT START: 34012899
CALCULATED SIZE: 1048576
SIZE OF CHUNK TO WRITE: 1048576
CALCULATED SIZE: 1048576
SIZE OF CHUNK TO WRITE: 1048576
...
...
...
CALCULATED SIZE: 458467
SIZE OF CHUNK TO WRITE: 458467
So for whatever reason, after a large number of previous downloads, that file.slice step results in an empty/corrupted blob.
Any ideas on how to correct this?
Ran another test with some expanded logging:
private writeFileInChunks(writer: FileWriter, file: Blob) {
  ...
  function writeNextChunk() {
    const size = Math.min(BLOCK_SIZE, file.size - writtenSize);
    console.log('CALCULATED SIZE:', size);
    console.log('WRITTEN SIZE', writtenSize);
    console.log('SUMS TO:', writtenSize + size);
    console.log('FILE SIZE BEFORE SLICE:', file.size);
    const chunk = file.slice(writtenSize, writtenSize + size);
    console.log('SIZE OF CHUNK TO WRITE', chunk.size);
    writtenSize += size;
    writer.write(chunk);
  }
  ...
  ...
}
Output came to:
CALCULATED SIZE: 1048576
WRITTEN SIZE: 0
SUMS TO: 1048576
FILE SIZE BEFORE SLICE: 34012899
SIZE OF CHUNK TO WRITE 0
Further confirming the issue

C++ - read and write pcapng files without libpcap

I'm interested in reading and writing pcapng files without using libpcap or WinPcap. Does anyone know how to do it?
I can recommend a C library that does exactly that: it's called LightPcapNg, and PcapPlusPlus apparently uses it underneath a cleaner C++ wrapper.
Since you're interested in C++, here is a code snippet of how to read a pcap-ng file using PcapPlusPlus:
#include <PcapFileDevice.h>

void readAndWritePcapNg(char* inputFileName, char* outputFileName)
{
  // reader instance
  PcapNgFileReaderDevice readerDev(inputFileName);
  // writer instance
  PcapNgFileWriterDevice writerDev(outputFileName);

  // open reader and writer
  readerDev.open();
  writerDev.open();

  RawPacket rawPacket;
  // read packets from file
  while (readerDev.getNextPacket(rawPacket))
  {
    Packet packet(&rawPacket);
    // do whatever you want with the packet
    // ...
    // write the packet to the output file
    writerDev.writePacket(rawPacket);
  }

  // close reader and writer
  readerDev.close();
  writerDev.close();
}
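If you would rather not pull in any library at all, the pcapng on-disk format itself is regular enough to walk by hand: every block starts with a 4-byte block type and a 4-byte total length, and ends with that total length repeated. Below is a rough C sketch of such a block walker; walk_pcapng and the two type constants are just illustrative names, and it assumes the file was written in the host's byte order (a real reader should check the Section Header Block's byte-order magic, 0x1A2B3C4D, and swap if needed):

#include <stdint.h>
#include <stdio.h>

/* Generic pcapng block header: type, total length, body, total length again. */
struct block_header {
  uint32_t type;
  uint32_t total_length;
};

#define SHB_TYPE 0x0A0D0D0AU   /* Section Header Block */
#define EPB_TYPE 0x00000006U   /* Enhanced Packet Block (captured packets) */

int walk_pcapng(const char *path)
{
  struct block_header hdr;
  FILE *f = fopen(path, "rb");

  if(f == NULL) {
    return -1;
  }
  while(fread(&hdr, sizeof(hdr), 1, f) == 1) {
    if(hdr.total_length < 12) {    /* smallest legal block size */
      break;
    }
    printf("block type 0x%08x, length %u\n",
           (unsigned)hdr.type, (unsigned)hdr.total_length);
    if(hdr.type == EPB_TYPE) {
      /* Packet bytes start 20 bytes into the body, after the interface ID,
         the two timestamp words and the captured/original lengths. */
    }
    /* Skip the rest of the block: body plus the trailing length field. */
    if(fseek(f, (long)hdr.total_length - (long)sizeof(hdr), SEEK_CUR) != 0) {
      break;
    }
  }
  fclose(f);
  return 0;
}

Writing works the same way in reverse: emit a Section Header Block, an Interface Description Block and then one Enhanced Packet Block per packet, always padding block bodies to a multiple of four bytes. That said, the libraries above already handle the corner cases (byte order, options, multiple sections), so hand-rolling is mostly worthwhile for very constrained environments.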

Blackberry - Programmatically extract/open zip file

I have looked online with mixed results, but is there a way to programmatically extract a zip file on the BlackBerry? Very basically, my app will display different encrypted file types, and those files are delivered in a zip file. My idea is to have the user browse to the file on their SD card, select it, and then extract what I need as a stream from the file. Is this possible?
Use GZIPInputStream
Example:
// uses net.rim.device.api.compress.GZIPInputStream
try
{
    InputStream inputStream = httpConnection.openInputStream();
    GZIPInputStream gzis = new GZIPInputStream(inputStream);
    StringBuffer sb = new StringBuffer();
    int c;
    // read() returns -1 at end of stream; compare before casting to char
    while ((c = gzis.read()) != -1)
    {
        sb.append((char) c);
    }
    String data = sb.toString();
    gzis.close();
}
catch (IOException ioe)
{
    // handle or log the error
}
Just two things:
In the BB API there is only GZip and ZLib support, with no multi-file archive support, so it is not possible to compress several files and extract only one of them.
In my experience, such functionality flies on the simulator but can be a real performance killer on a real device.
See How to retrieve data from a attached zip file in Blackberry application?
PS: You can actually implement a custom multi-entry stream and parse it after decompressing, but that seems pointless if you want this archive format to be supported by other applications.
