Stream STDOUT to file

I'm trying to stream STDOUT/:stdio into a file using a Task. I've tried variations on piping and Enum.each, as well as putting each function in the parent process. None of them produce compilation errors, but none of them write to my file either. Here's my current best guess at how this should work:
defmodule OutputWriter do
  def start do
    Task.start(fn -> stream() end)
  end

  defp stream do
    Enum.each(IO.stream(:stdio, :line), fn line ->
      File.write(Path.join("test", "output.txt"), line)
    end)
  end
end
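A minimal sketch of a working variant (my own illustration, assuming the test directory already exists): one likely culprit above is that File.write/2 reopens the file with truncating write semantics on every line, so at most the last line survives. Collecting the stream into File.stream! opens the file once instead and writes each line as it arrives:

defmodule OutputWriter do
  def start do
    Task.start(fn -> stream() end)
  end

  defp stream do
    # Open test/output.txt once and copy every incoming line into it.
    IO.stream(:stdio, :line)
    |> Stream.into(File.stream!(Path.join("test", "output.txt")))
    |> Stream.run()
  end
end

Opening the file once with File.open(path, [:append]) and calling IO.binwrite/2 per line would work just as well.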

Related

"function arguments expected near 'levelc'" when using LOVE

I'm currently trying to make a level loading system for a game.
function love.filedropped(file)
  ofile=io.open(file:getFilename(),"r")
  io.input(ofile)
  file:close
  levelc=io.read()
  for i=1,levelc do
    levels[i]=io.read()
    print levels[i]
  end
levelc should be the first line of the file, and file:getFilename() is the file to open (path included). The project gives an error message on startup, and I've used a similar structure before, but for output. The error is at line 30, which is the levelc=io.read() line.
I've tried changing the name of the file pointer (it was "f" before, now "ofile"), and I've tried using io.read("*l") instead of io.read(), but with the same result.
EDITS:
-this is inside a love.filedropped(file) callback
-I need to open other files from a .txt later and I don't really understand how to do that
The parameter given by love.filedropped is a DroppedFile.
In your case, File:lines() could be helpful.
For example:
function love.filedropped(file)
  -- Open for reading
  file:open("r")
  -- Iterate over the lines
  local i = 0
  for line in file:lines() do
    i = i + 1
    levels[i] = line
    print(i, levels[i]) -- Notice the parentheses missing in your code
  end
  -- Close the file
  file:close()
end
Notice that love2d usually only allows reading/writing files within the save or working directory. Dropped files are an exception.
Unrelated to this answer, but some things I noticed in your code:
Use locals; ofile should be local
file:close() requires parentheses, as it's a function call
Same for the print
The filedropped callback has no end
You mentioned reading other files too. To do so, you can either:
Use love.filesystem.newFile and a similar approach as before
Use the recommended one-liner love.filesystem.lines (see the sketch below)
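For example, a minimal sketch of the love.filesystem.lines route (my own illustration; the filename levels.txt is hypothetical, and the file must live somewhere love2d can read, such as the save or working directory):

local levels = {}
-- love.filesystem.lines returns an iterator over the file's lines.
for line in love.filesystem.lines("levels.txt") do
  levels[#levels + 1] = line
end
print(#levels .. " levels loaded")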

Elixir/Erlang file_server message backlog and unreliable throughput causing performance issues

I'm running a production app that does a lot of I/O. Whenever the system gets flooded with new requests (with which I do a ton of I/O), I see the Erlang file_server backing up with messages. The backup/slowdown can last hours depending on our volume.
It's my understanding that a lot of File calls actually go through the Erlang file_server, which appears to have limited throughput. Furthermore, when the message queue gets backed up, the entire app is essentially frozen (locked up) and cannot process new I/O requests at all.
All of the I/O calls use the File module. I've specified the [:raw] option everywhere that will allow it. It's my understanding that passing in :raw will bypass the file_server.
This is a really big issue for us, and I imagine others have run into it at some point. I experimented with rewriting the I/O logic in Ruby, which resulted in a massive gain in throughput (I don't have exact numbers, but it was a noticeable difference).
Does anyone know what else I can look at doing to increase performance/throughput?
Sample Code:
defmodule MyModule.Ingestion.Insertion.Folder do
  use MyModule.Poller

  alias MyModule.Helpers

  def perform() do
    Logger.info("#{__MODULE__} starting check")

    for path <- paths() do
      files = Helpers.Path.list_files(path, ".json")

      Task.async_stream(
        files,
        fn file ->
          result =
            file
            |> File.read!()
            |> Jason.decode()

          case result do
            {:ok, data} ->
              file_created_at = Helpers.File.created_time(file)
              data = Map.put(data, :file_created_at, file_created_at)
              filename = Path.basename(file, ".json")
              :ok = MyModule.InsertWorker.enqueue(%{data: data, filename: filename})

              destination =
                Application.fetch_env!(:my_application, :backups) <> filename <> ".json"

              File.copy!(file, destination)
              File.rm!(file)

            _err ->
              nil
          end
        end,
        timeout: 60_000,
        max_concurrency: 10
      )
      |> Stream.run()
    end

    Logger.info("#{__MODULE__} check finished")
  end

  def paths() do
    path = Application.fetch_env!(:my_application, :lob_path)

    [
      path <> "postcards/",
      path <> "letters/"
    ]
  end
end
Consider tuning the VM's async thread settings.
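For example (a sketch only; the right values are workload-dependent, and note that from OTP 21 onward file operations run on dirty I/O schedulers rather than the async pool):

## vm.args
+A 128      ## async thread pool size (used by file drivers before OTP 21)
+SDio 32    ## dirty I/O schedulers (file operations on OTP 21 and later)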
For anyone finding this in the future: the root of the problem comes from using File.copy! with path names. When you do this, the copy goes through the Erlang file_server, which was the cause of a massive bottleneck that was extremely difficult to diagnose. Instead of calling File.copy/2 with path names, pass already-opened files as the input, like this:
source = "path/to/some/file"
destination = "path/to/destination/file"

with {:ok, source} <- File.open(source, [:raw, :read]),
     {:ok, destination} <- File.open(destination, [:raw, :write]),
     {:ok, bytes} <- File.copy(source, destination),
     :ok <- File.close(source),
     :ok <- File.close(destination) do
  {:ok, bytes}
end

Reading a text file located on the computer with NodeMCU using Lua

My problem is about reading a text file (which is located on my computer) on the NodeMCU development kit. I am able to read the file's content in the Ubuntu terminal using a Lua script. Here is the code that I have been using for reading; both versions work pretty well in the Ubuntu terminal.
First one:
local open = io.open

local function read_file(path)
  local file = open(path, "rb") -- r read mode and b binary mode
  if not file then return nil end
  local content = file:read "*a" -- *a or *all reads the whole file
  file:close()
  return content
end

local fileContent = read_file("output.txt");
print (fileContent);

Second one:
function file_exists(file)
  local f = io.open(file, "rb")
  if f then f:close() end
  return f ~= nil
end
-- get all lines from a file, returns an empty
-- list/table if the file does not exist
function lines_from(file)
  if not file_exists(file) then return {} end
  lines = {}
  for line in io.lines(file) do
    lines[#lines + 1] = line
  end
  return lines
end

-- tests the functions above
local file = 'output.txt'
local lines = lines_from(file)

-- print all line numbers and their contents
for k,v in pairs(lines) do
  print('line[' .. k .. ']', v)
end
My problem occurs when I send the code to the NodeMCU using ESPlorer. The error looks like this:
attempt to index global 'io' (a nil value)
stack traceback:
  applicationhuff.lua:5: in function 'file_exists'
  applicationhuff.lua:13: in function 'lines_from'
  applicationhuff.lua:23: in main chunk
  [C]: in function 'dofile'
  stdin:1: in main chunk
My overall goal is actually to read this data and publish it to a Mosquitto broker via the MQTT protocol. I am new to these topics. If anyone can help with my problem, it would be appreciated. Thanks for your help...
NodeMCU does not have an io library. Therefore you get an error for indexing io, which is a nil value.
No offense, but sometimes I wonder how you guys actually manage to find your way to StackOverflow and even write some code without knowing how to do basic web research.
https://nodemcu.readthedocs.io/en/master/en/lua-developer-faq/
The firmware has replaced some standard Lua modules that don't align well with the SDK structure with ESP8266-specific versions. For example, the standard io and os libraries don't work, but have been largely replaced by the NodeMCU node and file libraries.
https://nodemcu.readthedocs.io/en/master/en/modules/file/
The file module provides access to the file system and its individual files.
I hope that's enough help...
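As a concrete starting point, here is a minimal sketch of lines_from rewritten against NodeMCU's file module (my own illustration; it assumes output.txt has already been uploaded to the board's flash filesystem):

local lines = {}

if file.open("output.txt", "r") then
  -- file.readline() returns the next line, or nil at end of file.
  local line = file.readline()
  while line do
    lines[#lines + 1] = line
    line = file.readline()
  end
  file.close()
end

for k, v in pairs(lines) do
  print('line[' .. k .. ']', v)
end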

How to write a file in Lua?

I just want to write a file somewhere. Here is my code:
file = io.open("test.txt", "w")
file:write("Hello World")
file:close()
but the app crashes at the first line with:
attempt to call field 'open' (a nil value)
Even trying it in the Lua online console gives the same error.
Try this code to dump the keys of io:
for k in next,io do
  print(k)
end
Lua online outputs:
write
That makes sense when you think about it: sandboxing works by restricting what can be done, including the removal of unsafe functions.
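As a defensive pattern (my own addition, not part of the original answer), you can probe for the functions you need before calling them, so a sandboxed environment fails with a clear message instead of a crash:

if io and io.open then
  local f = assert(io.open("test.txt", "w"))
  f:write("Hello World")
  f:close()
else
  -- Sandboxed: io.open has been stripped out.
  print("io.open is unavailable in this environment")
end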

Can I display the log of a system call in Ruby?

I need to call a command (in a Sinatra or Rails app) like this:
`command sub`
Some log output is produced while the command is executing.
I want to see that log displayed continuously while the process runs.
But I can only get the log as a single string after it's done, with:
result = `command sub`
So, is there a way to implement this?
On Windows I have the best experience with IO.popen.
Here is a sample:
require 'logger'

$log = Logger.new("#{__FILE__}.log", 'monthly')

# Here comes the full command line; in this case it is a Java program.
command = %Q{java -jar getscreen.jar #{$userid} #{$password}}
$log.debug command

STDOUT.sync = true

begin
  # Note the somewhat strange 2> syntax. This denotes the file descriptor
  # to pipe to a file. By convention, 0 is stdin, 1 is stdout, 2 is stderr.
  IO.popen(command + " 2>&1") do |pipe|
    pipe.sync = true
    while str = pipe.gets # for every line the external program returns
      # do something with the captured line
    end
  end
rescue => e
  $log.error "#{__LINE__}:#{e}"
  $log.error e.backtrace
end
There are six ways to do it, but the one you're using isn't the right fit because it waits for the process to return.
Pick one from here:
http://tech.natemurray.com/2007/03/ruby-shell-commands.html
I would use Open3.popen3 if I were you.
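A minimal sketch of that approach, reusing the command from the question (popen3 lives in the stdlib open3 module, not on IO):

require 'open3'

# Stream the command's output as it is produced instead of
# waiting for the whole process to finish.
Open3.popen3('command sub') do |stdin, stdout, stderr, wait_thr|
  stdout.each_line { |line| puts line }
  status = wait_thr.value # Process::Status, available once the command exits
  puts "exited: #{status.exitstatus}"
end

If you want stderr interleaved with stdout, Open3.popen2e merges the two streams into one.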
