OmniSci query throttling on NVIDIA GPU + CUDA

I was trying to benchmark a few of my queries in OmniSci on a GPU server, but the queries are choking. I then experimented with the sample flights dataset provided by OmniSci itself.
Below are my observations (I am using the JDBC connector):
1. PreparedStatement pstmt2 = conn.prepareStatement("select * from flights_2008_7M natural join omnisci_countries");
pstmt2.execute(); // with 8 parallel threads
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 410.104      Driver Version: 410.104      CUDA Version: 10.0    |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 Tesla P100-SXM2... On | 00000000:18:00.0 Off | 0 |
| N/A 43C P0 45W / 300W | 2343MiB / 16280MiB | 10% Default |
+-------------------------------+----------------------+----------------------+
| 1 Tesla P100-SXM2... On | 00000000:3B:00.0 Off | 0 |
| N/A 35C P0 42W / 300W | 2343MiB / 16280MiB | 15% Default |
+-------------------------------+----------------------+----------------------+
| 2 Tesla P100-SXM2... On | 00000000:86:00.0 Off | 0 |
| N/A 33C P0 42W / 300W | 2343MiB / 16280MiB | 14% Default |
+-------------------------------+----------------------+----------------------+
| 3 Tesla P100-SXM2... On | 00000000:AF:00.0 Off | 0 |
| N/A 38C P0 42W / 300W | 2343MiB / 16280MiB | 10% Default |
+-------------------------------+----------------------+----------------------+
2. PreparedStatement pstmt2 = conn.prepareStatement(
"select * from flights_2008_7M where dest = 'TPA' limit 100000");
pstmt2.execute(); // with 8 threads
The script hung and nothing moved; in fact, there was no GPU utilization at all. I just wanted to check whether this is a configuration issue. How can I maximize GPU utilization and execute complex queries on a larger dataset?

Are you sure the query isn't falling back to CPU execution? I used optimized DDL to be sure the columns used by the query fit into VRAM.
To be sure the query isn't punting to CPU execution, go into mapd_log/omnisci_server.INFO and, after you run the query, make sure you are not getting messages like this:
Query unable to run in GPU mode, retrying on CPU.
I did a brief try using the 1.2B+ row, non-optimized table on an AWS server with 4x V100 GPUs, and I had to change the parameter gpu-input-mem-limit=4 because of a bug with the default fragment size of 32M (you can change it by adding the parameter to the omnisci.conf file and restarting the instance).
Have you changed the fragment size on your flights table? The one in flights_2008_7M is very low.
If not, recreate the table with the default fragment size of 32000000 or bigger, as in the sketch below.
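A minimal sketch of both changes (the config key is the one named above; the CTAS form assumes your OmniSci version supports WITH options on CREATE TABLE ... AS, and the new table name is illustrative):
# omnisci.conf
gpu-input-mem-limit = 4

-- recreate the flights table keeping the default fragment size
CREATE TABLE flights_2008_7M_refrag AS
(SELECT * FROM flights_2008_7M)
WITH (fragment_size = 32000000);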
The execution time on a single thread is around 290 ms:
78 %, 84 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
81 %, 88 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
77 %, 84 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
76 %, 83 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
79 %, 85 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
73 %, 80 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
91 %, 99 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
77 %, 84 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
95 %, 100 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
76 %, 82 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
93 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
82 %, 88 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
95 %, 100 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
75 %, 82 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
77 %, 83 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
78 %, 85 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
76 %, 83 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
75 %, 82 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
90 %, 97 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
74 %, 80 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
75 %, 82 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
Running with four threads, the response time increases to around 1100 ms, with a slight increase in GPU utilization:
93 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
85 %, 93 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
89 %, 95 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
95 %, 100 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
90 %, 98 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
89 %, 96 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
84 %, 91 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
92 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
87 %, 95 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
89 %, 98 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
94 %, 100 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
89 %, 95 %, 1530 MHz, 16130 MiB, 6748 MiB, 9382 MiB
84 %, 91 %, 1530 MHz, 16130 MiB, 6924 MiB, 9206 MiB
88 %, 97 %, 1530 MHz, 16130 MiB, 8972 MiB, 7158 MiB
Some GPUs are less busy than others because the data is unbalanced; we should shard the table to get an even distribution between the GPUs (see the sketch below).
The runtimes are so high because on a projection query like this the server processes one fragment at a time (default 32M rows), so there is some overhead in moving data back and forth between CPU and GPU.
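A sketch of a sharded DDL (abbreviated, illustrative column list; the shard key must be a column used in the join, and shard_count typically equals the number of GPUs):
CREATE TABLE flights_sharded (
  carrier_id INTEGER,
  dest TEXT ENCODING DICT(32),
  depdelay SMALLINT,
  arrdelay SMALLINT,
  SHARD KEY (carrier_id)
) WITH (shard_count = 4, fragment_size = 32000000);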

OmniSci is designed as an analytical database, so it is not well suited to simple projection queries that do a little filtering and return a lot of columns; nevertheless, a query like the one you are running takes just 31 ms on my workstation, which uses only two gaming-class GPUs:
select * from flights_2008_7m natural join omnisci_countries where dest='DEN' ;
7 rows returned.
Execution time: 29 ms, Total time: 31 ms.
As expected, the utilization percentage of the GPUs floats around one percent, because the dataset is small and the operations done on GPU are just a join and filtering.
To see something running on GPU you should run queries with more records and fewer projected columns (just the ones you need for your calculations), like this one:
select so.name,sd.name,so.id,arrdelay, depdelay,st_area(so.omnisci_geo)
from flights_b
join omnisci_states sd on dest_state=sd.abbr
join omnisci_states so on origin_state=so.abbr
where dep_timestamp between '2007-04-05 00:00:00' and '2007-10-07 00:00:00'
and depdelay between 10 and 30
and depdelay>arrdelay limit 1000;
I changed the join condition because with a natural join the tables are joined on the rowid pseudo-column, so it was impossible to get more rows than there are in the geo table.
The query runs on the same dataset, but the flights_b table contains 1.2 billion rows instead of the 7 million in the example.
Because I'm on gaming-class GPUs the st_area function is quite taxing, so this query takes 917 ms to run; on a system with Tesla-class GPUs it would take a lot less.
Here is the output of nvidia-smi while running the query:
0 %, 0 %, 1920 MHz, 10989 MiB, 540 MiB, 10449 MiB
0 %, 0 %, 1875 MHz, 10988 MiB, 538 MiB, 10450 MiB
88 %, 71 %, 1920 MHz, 10989 MiB, 540 MiB, 10449 MiB
85 %, 74 %, 1875 MHz, 10988 MiB, 538 MiB, 10450 MiB
0 %, 0 %, 1920 MHz, 10989 MiB, 540 MiB, 10449 MiB
0 %, 0 %, 1875 MHz, 10988 MiB, 538 MiB, 10450 MiB
Running from omnisql or a Java tool like DBeaver using the JDBC driver is the same.
Have you tried your queries with a Java tool like DBeaver and/or Tableau? (The latter for query concurrency.)

Related

Cannot find module '@vaadin/flow-frontend/Flow'

Vaadin suddenly stops building my library with the following error. I already did the Vaadin dance (and a lot of other things) but I'm running out of ideas now. I'm trying to build the library for production (but it also fails for dev).
I'm using Vaadin Flow. The issue tracker on GitHub redirected here for general community help, so I hope someone has an idea how to solve this problem or what else I can try.
> Task vaadinBuildFrontend FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task 'vaadinBuildFrontend'.
> Webpack process exited with non-zero exit code.
Stderr: 'Hash: e1a6ace26ca6df86c87b
Version: webpack 4.46.0
Time: 6054ms
Built at: 12/22/2021 7:13:38 PM
Asset Size Chunks Chunk Names
VAADIN/build/vaadin-1-8595bda5c7958e210407.cache.js 894 KiB 1 [immutable]
VAADIN/build/vaadin-2-d5ce16eeb5d943cfe059.cache.js 284 KiB 2 [immutable]
VAADIN/build/vaadin-3-7d2fe309de5248ed4c09.cache.js 48.7 KiB 3 [immutable]
VAADIN/build/vaadin-4-2758512dfda3ea8392cc.cache.js 1.05 KiB 4 [immutable]
Entrypoint bundle =
[0] ./generated/vaadin.ts + 2 modules 91.2 KiB {0} [built]
| ./generated/vaadin.ts 18 bytes [built]
| ./generated/index.ts 337 bytes [built]
| ../node_modules/.pnpm/@vaadin/router@1.7.4/node_modules/@vaadin/router/dist/vaadin-router.js 90.9 KiB [built]
[1] ../node_modules/.pnpm/@vaadin/vaadin-themable-mixin@22.0.1/node_modules/@vaadin/vaadin-themable-mixin/vaadin-themable-mixin.js 7.07 KiB {2} [built]
[3] ../node_modules/.pnpm/lit@2.0.0/node_modules/lit/index.js + 3 modules 8.83 KiB {2} [built]
| ../node_modules/.pnpm/lit@2.0.0/node_modules/lit/index.js 122 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/reactive-element.js 5.89 KiB [built]
| ../node_modules/.pnpm/lit-element@3.0.2/node_modules/lit-element/lit-element.js 1.35 KiB [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/css-tag.js 1.46 KiB [built]
[4] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/spacing.js 1.42 KiB {2} [built]
[5] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/color.js 6.82 KiB {2} [built]
[6] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/style.js 1.38 KiB {2} [built]
[8] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/sizing.js 829 bytes {2} [built]
[9] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/typography.js 2.99 KiB {2} [built]
[27] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/version.js 313 bytes {2} [built]
[43] ../node_modules/.pnpm/@vaadin/button@22.0.1/node_modules/@vaadin/button/theme/lumo/vaadin-button.js 73 bytes {1} [built]
[80] ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/icons.js + 2 modules 31.5 KiB {2} [built]
| ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/icons.js 276 bytes [built]
| ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/iconset.js 15.5 KiB [built]
| ../node_modules/.pnpm/@vaadin/vaadin-lumo-styles@22.0.1/node_modules/@vaadin/vaadin-lumo-styles/vaadin-iconset.js 15.7 KiB [built]
[188] ../node_modules/.pnpm/@vaadin/common-frontend@0.0.17_lit@2.0.0/node_modules/@vaadin/common-frontend/ConnectionIndicator.js + 14 modules 35.9 KiB {2} [built]
| ../node_modules/.pnpm/@vaadin/common-frontend@0.0.17_lit@2.0.0/node_modules/@vaadin/common-frontend/ConnectionIndicator.js 14.3 KiB [built]
| ../node_modules/.pnpm/tslib@2.3.1/node_modules/tslib/tslib.es6.js 11.5 KiB [built]
| ../node_modules/.pnpm/lit@2.0.0/node_modules/lit/decorators.js 525 bytes [built]
| ../node_modules/.pnpm/lit@2.0.0/node_modules/lit/directives/class-map.js 85 bytes [built]
| ../node_modules/.pnpm/@vaadin/common-frontend@0.0.17_lit@2.0.0/node_modules/@vaadin/common-frontend/ConnectionState.js 4.36 KiB [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/custom-element.js 364 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/property.js 572 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/state.js 225 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/event-options.js 280 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/query.js 612 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/query-all.js 388 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/query-async.js 392 bytes [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/query-assigned-nodes.js 603 bytes [built]
| ../node_modules/.pnpm/lit-html@2.0.2/node_modules/lit-html/directives/class-map.js 1.1 KiB [built]
| ../node_modules/.pnpm/@lit/reactive-element@1.0.2/node_modules/@lit/reactive-element/decorators/base.js 666 bytes [built]
[217] ../node_modules/.pnpm/@vaadin/polymer-legacy-adapter@22.0.1/node_modules/@vaadin/polymer-legacy-adapter/style-modules.js + 1 modules 4.38 KiB {2} [built]
| ../node_modules/.pnpm/@vaadin/polymer-legacy-adapter@22.0.1/node_modules/@vaadin/polymer-legacy-adapter/style-modules.js 191 bytes [built]
| ../node_modules/.pnpm/@vaadin/polymer-legacy-adapter@22.0.1/node_modules/@vaadin/polymer-legacy-adapter/src/style-modules.js 4.17 KiB [built]
[222] ../build/frontend/generated-flow-imports-fallback.js + 56 modules 104 KiB {3} [built]
| ../build/frontend/generated-flow-imports-fallback.js 9.98 KiB [built]
| ./basic/variables.css 943 bytes [built]
| ./images/cropper/cropper.css 778 bytes [built]
| ./basic/spacer.css 116 bytes [built]
| ./tab-box/tab-box.css 458 bytes [built]
| ./text/text-area.css 103 bytes [built]
| ./buttons/internal-button.css 104 bytes [built]
| ./layouts/br-app-layout.css 206 bytes [built]
| ./buttons/flat-icon-button.css 242 bytes [built]
| ./modal/raven-modal.css 219 bytes [built]
| ./headings/headings.css 60 bytes [built]
| ./layouts/br-horizontal-layout.css 150 bytes [built]
| ./text/text-line-with-icon.css 182 bytes [built]
| ./input/br-inline-edit.css 159 bytes [built]
| ./buttons/action-button.css 890 bytes [built]
| + 42 hidden modules
[223] ../build/frontend/generated-flow-imports.js + 1 modules 1.72 KiB {4} [built]
| ../build/frontend/generated-flow-imports.js 1.41 KiB [built]
| ../build/flow-frontend/lumo-includes.ts 276 bytes [built]
+ 209 hidden modules
ERROR in ./generated/index.ts
Module not found: Error: Can't resolve '@vaadin/flow-frontend/Flow' in '/XXX/frontend/generated'
@ ./generated/index.ts 2:0-50 3:33-37
@ ./generated/vaadin.ts
ERROR in chunk bundle [entry]
VAADIN/build/vaadin-bundle-ca5b59ddaf6cebb1e7aa.cache.js
/XXX/node_modules/.pnpm/esbuild-loader@2.15.1_webpack@4.46.0/node_modules/esbuild-loader/dist/index.js??ref--4!/XXX/frontend/generated/vaadin.ts a8f7ce42c2f8c5128bbf70605fdcb37b
Unexpected token (4:33)
|
|
| const { serverSideRoutes } = new !(function webpackMissingModule() { var e = new Error("Cannot find module '@vaadin/flow-frontend/Flow'"); e.code = 'MODULE_NOT_FOUND'; throw e; }())({
| imports: () => Promise.all(/* import() */[__webpack_require__.e(2), __webpack_require__.e(4)]).then(__webpack_require__.bind(null, 223))
| });
ERROR in frontend/generated/index.ts:17:22
TS2307: Cannot find module '@vaadin/flow-frontend/Flow' or its corresponding type declarations.
15 |
16 | // import Flow module to enable navigation to Vaadin server-side views
> 17 | import { Flow } from '@vaadin/flow-frontend/Flow';
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18 |
19 | const { serverSideRoutes } = new Flow({
20 | imports: () => import('../../build/frontend/generated-flow-imports')
Child HtmlWebpackCompiler:
Asset Size Chunks Chunk Names
__child-HtmlWebpackPlugin_0 4.4 KiB 0 HtmlWebpackPlugin_0
Entrypoint HtmlWebpackPlugin_0 = __child-HtmlWebpackPlugin_0
[0] ../node_modules/.pnpm/html-webpack-plugin@4.5.1_webpack@4.46.0/node_modules/html-webpack-plugin/lib/loader.js!./index.html 830 bytes {0} [built]
<i> [build-status] 3 errors and 0 warnings were reported.
<i> [build-status] : Failed to compile.
'
// Edit 1
I already tried all of this (in this order):
rm -rf ~/.pnpm-store ~/.vaadin
rm -rf package.json pnpm-lock.json pnpmfile.json tsconfig.json webpack.config.js webpack.generated.js .npmrc frontend/generated/ frontend/index.html build/ target/ node_modules pnpm-lock.yaml pnpmfile.js types.d.ts
./gradlew vaadinClean && ./gradlew vaadinPrepareFrontend && ./gradlew vaadinBuildFrontend
I had the same problem migrating from V21 to V22. I fixed it by deleting the target folder (and generated files in the root folder like tsconfig etc.) and rebuilding. It worked perfectly afterwards with no further problems. My suspicion is that something left over from V21 in the target folder confused the build.
I found the solution by rebuilding the production build step by step and testing where it failed.
TL;DR: include the dependency com.vaadin:flow-client if you don't use com.vaadin:vaadin-core.
As I stated, I'm trying to build a Vaadin library (to share components across different nodes). Therefore, I didn't include the dependency com.vaadin:vaadin-core, only the dependencies I actually needed (also to avoid version conflicts of transitive dependencies).
For some reason (probably due to upgrading the Vaadin version), I now also need the dependency com.vaadin:flow-client. Without this specific dependency the above-mentioned error can be reproduced.
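For a Gradle build like the one above, a minimal sketch of the fix (the version is resolved by the Vaadin BOM if you use one; otherwise pin it to your Vaadin release):
dependencies {
    // needed for the frontend build when com.vaadin:vaadin-core is not on the classpath
    implementation("com.vaadin:flow-client")
}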

Is there a way to purge FontAwesome with Webpack?

My project is currently using Rails 6.1, @fortawesome/fontawesome-free 5.15.3 & Heroku. My config is straightforward:
// package.json
{
"name": "web",
"dependencies": {
"#fortawesome/fontawesome-free": "^5.15.3",
...
},
...
}
// app/javascript/packs/application.js
...
Rails.start()
...
import "stylesheets/application"
import "#fortawesome/fontawesome-free/css/all"
I use FontAwesome icons as CSS classes, which I call within my .html.erb files:
<i class="fas fa-users"></i>
I use very few icons from the framework (< 20), yet Webpack seems to compile the whole framework. Everything works fine; I am just surprised that the FA framework is still compiled in all possible formats (css (the only one I need), svg, webfonts).
Here are my Heroku build logs:
Version: webpack 4.46.0
Time: 19571ms
Built at: 07/21/2021 9:10:54 PM
Asset Size Chunks Chunk Names
css/application-e87f29dc.css 72.8 KiB 0 [emitted] [immutable] application
css/application-e87f29dc.css.br 13.7 KiB [emitted]
css/application-e87f29dc.css.gz 16.4 KiB [emitted]
js/application-5ac2d3a4589f0bac8765.js 128 KiB 0 [emitted] [immutable] application
js/application-5ac2d3a4589f0bac8765.js.br 25.4 KiB [emitted]
js/application-5ac2d3a4589f0bac8765.js.gz 29.3 KiB [emitted]
js/application-5ac2d3a4589f0bac8765.js.map 370 KiB 0 [emitted] [dev] application
js/application-5ac2d3a4589f0bac8765.js.map.br 75.2 KiB [emitted]
js/application-5ac2d3a4589f0bac8765.js.map.gz 86.9 KiB [emitted]
manifest.json 1.82 KiB [emitted]
manifest.json.br 348 bytes [emitted]
manifest.json.gz 403 bytes [emitted]
media/webfonts/fa-brands-400-216edb96.svg 730 KiB [emitted] [big]
media/webfonts/fa-brands-400-216edb96.svg.br 218 KiB [emitted]
media/webfonts/fa-brands-400-216edb96.svg.gz 249 KiB [emitted] [big]
media/webfonts/fa-brands-400-329a95a9.woff 87.9 KiB [emitted]
media/webfonts/fa-brands-400-89a52ae1.eot 131 KiB [emitted]
media/webfonts/fa-brands-400-89a52ae1.eot.br 81.9 KiB [emitted]
media/webfonts/fa-brands-400-89a52ae1.eot.gz 88.6 KiB [emitted]
media/webfonts/fa-brands-400-9e138496.ttf 131 KiB [emitted]
media/webfonts/fa-brands-400-9e138496.ttf.br 81.9 KiB [emitted]
media/webfonts/fa-brands-400-9e138496.ttf.gz 88.5 KiB [emitted]
media/webfonts/fa-brands-400-c1210e5e.woff2 75 KiB [emitted]
media/webfonts/fa-regular-400-1017bce8.ttf 32.9 KiB [emitted]
media/webfonts/fa-regular-400-1017bce8.ttf.br 15 KiB [emitted]
media/webfonts/fa-regular-400-1017bce8.ttf.gz 15.9 KiB [emitted]
media/webfonts/fa-regular-400-19e27d34.svg 141 KiB [emitted]
media/webfonts/fa-regular-400-19e27d34.svg.br 30.3 KiB [emitted]
media/webfonts/fa-regular-400-19e27d34.svg.gz 36.3 KiB [emitted]
media/webfonts/fa-regular-400-36722648.woff 15.9 KiB [emitted]
media/webfonts/fa-regular-400-4079ae2d.eot 33.2 KiB [emitted]
media/webfonts/fa-regular-400-4079ae2d.eot.br 15.1 KiB [emitted]
media/webfonts/fa-regular-400-4079ae2d.eot.gz 15.9 KiB [emitted]
media/webfonts/fa-regular-400-68c5af1f.woff2 13 KiB [emitted]
media/webfonts/fa-solid-900-07c3313b.ttf 198 KiB [emitted]
media/webfonts/fa-solid-900-07c3313b.ttf.br 90.3 KiB [emitted]
media/webfonts/fa-solid-900-07c3313b.ttf.gz 100 KiB [emitted]
media/webfonts/fa-solid-900-13de59f1.svg 897 KiB [emitted] [big]
media/webfonts/fa-solid-900-13de59f1.svg.br 199 KiB [emitted]
media/webfonts/fa-solid-900-13de59f1.svg.gz 250 KiB [emitted] [big]
media/webfonts/fa-solid-900-ada6e6df.woff2 76.4 KiB [emitted]
media/webfonts/fa-solid-900-c6ec0800.woff 99.3 KiB [emitted]
media/webfonts/fa-solid-900-efbd5d20.eot 198 KiB [emitted]
media/webfonts/fa-solid-900-efbd5d20.eot.br 90.4 KiB [emitted]
media/webfonts/fa-solid-900-efbd5d20.eot.gz 100 KiB [emitted]
Entrypoint application = css/application-e87f29dc.css js/application-5ac2d3a4589f0bac8765.js js/application-5ac2d3a4589f0bac8765.js.map
[0] ./node_modules/stimulus/index.js + 38 modules 77.4 KiB {0} [built]
| 39 modules
[2] (webpack)/buildin/module.js 552 bytes {0} [built]
[5] ./app/javascript/stylesheets/application.scss 39 bytes {0} [built]
[7] ./app/javascript/controllers sync _controller\.js$ 243 bytes {0} [built]
[8] ./app/javascript/controllers/forecasts_controller.js 3.28 KiB {0} [optional] [built]
[9] ./app/javascript/controllers/hello_controller.js 2.67 KiB {0} [optional] [built]
[10] ./app/javascript/controllers/search_controller.js 4.37 KiB {0} [optional] [built]
[11] ./app/javascript/packs/application.js + 4 modules 16.7 KiB {0} [built]
| ./app/javascript/packs/application.js 585 bytes [built]
| ./app/javascript/controllers/index.js 742 bytes [built]
| + 3 hidden modules
+ 6 hidden modules
WARNING in asset size limit: The following asset(s) exceed the recommended size limit (244 KiB).
This can impact web performance.
Assets:
media/webfonts/fa-solid-900-13de59f1.svg (897 KiB)
media/webfonts/fa-brands-400-216edb96.svg (730 KiB)
media/webfonts/fa-solid-900-13de59f1.svg.gz (250 KiB)
media/webfonts/fa-brands-400-216edb96.svg.gz (249 KiB)
WARNING in webpack performance recommendations:
You can limit the size of your bundles by using import() or require.ensure to lazy load some parts of your application.
For more info visit https://webpack.js.org/guides/code-splitting/
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js??ref--5-1!node_modules/postcss-loader/src/index.js??ref--5-2!node_modules/@fortawesome/fontawesome-free/css/all.css:
Entrypoint mini-css-extract-plugin = *
18 modules
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js??ref--6-1!node_modules/postcss-loader/src/index.js??ref--6-2!node_modules/sass-loader/dist/cjs.js??ref--6-3!app/javascript/stylesheets/application.scss:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js??ref--6-1!./node_modules/postcss-loader/src??ref--6-2!./node_modules/sass-loader/dist/cjs.js??ref--6-3!./app/javascript/stylesheets/application.scss 43.7 KiB {0} [built]
+ 1 hidden module
Asset precompilation completed (58.34s)
Should I worry about the size of these files like Webpack warns me? Is there any way to purge FontAwesome (like I do with Tailwind) so that only the used icons are compiled? Am I missing something about how Webpack works?
You could import only the icons you want to use:
import { library } from "@fortawesome/fontawesome-svg-core";
import {
faFacebookSquare,
faGooglePlusSquare
} from "@fortawesome/free-brands-svg-icons";
library.add(faFacebookSquare, faGooglePlusSquare);
Note that if those icons do not show, you may need the dom helper:
import { library, dom } from "@fortawesome/fontawesome-svg-core";
import {
faFacebookSquare,
faGooglePlusSquare
} from "@fortawesome/free-brands-svg-icons";
library.add(faFacebookSquare, faGooglePlusSquare);
// Kicks off the process of finding <i> tags and replacing with <svg>
dom.watch();
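Applied to the Rails setup in the question, that means dropping the css/all import from the pack and registering only the icons actually used; a sketch assuming the fa-users icon from the question (requires adding @fortawesome/fontawesome-svg-core and @fortawesome/free-solid-svg-icons to package.json):
// app/javascript/packs/application.js
import { library, dom } from "@fortawesome/fontawesome-svg-core";
import { faUsers } from "@fortawesome/free-solid-svg-icons";
library.add(faUsers); // <i class="fas fa-users"></i> keeps working
dom.watch();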

FFmpeg stream stops after a certain time

We have a little Node.js app which starts a stream process with child_process.spawn. On the client side we have an HTML5 canvas element whose video data we record with new MediaRecorder(canvas.captureStream(30), config); the client then sends its data to our Node.js server over a WebSocket connection. We use FFmpeg for video encoding and decoding, then we send the data to our third-party service (MUX), which accepts the stream and broadcasts it. Sadly the process continuously loses fps and, generally after 1 minute, stops with an interesting error code. (When we save the video result locally instead of streaming via rtmps, it works perfectly.)
* The whole system runs in Docker.
The error:
stderr: [tls @ 0x7f998e7bca40] Error in the pull function.
Our_app_logs: | av_interleaved_write_frame(): I/O error
Our_app_logs: | [flv @ 0x7f998eeb1680] Failed to update header with correct duration.
Our_app_logs: | [flv @ 0x7f998eeb1680] Failed to update header with correct filesize.
Our_app_logs: | Error writing trailer of rtmps://global-live.mux.com/app/94e85197-78a3-f092-3437-03d93aba74e0: I/O error
Our_app_logs: | <Buffer 5b 74 6c 73 20 40 20 30 78 37 66 39 39 38 65 37 62 63 61 34 30 5d 20 45 72 72 6f 72 20 69 6e 20 74 68 65 20 70 75 6c 6c 20 66 75 6e 63 74 69 6f 6e 2e ... >
Our_app_logs: | stderr: frame= 1478 fps= 25 q=23.0 Lsize= 402kB time=00:01:02.89 bitrate= 52.4kbits/s speed=1.05x
Our_app_logs: | video:369kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 9.034639%
Our_app_logs: | <Buffer 66 72 61 6d 65 3d 20 31 34 37 38 20 66 70 73 3d 20 32 35 20 71 3d 32 33 2e 30 20 4c 73 69 7a 65 3d 20 20 20 20 20 34 30 32 6b 42 20 74 69 6d 65 3d 30 ... >
Our_app_logs: | stderr: [tls @ 0x7f998e7bca40] <Buffer 5b 74 6c 73 20 40 20 30 78 37 66 39 39 38 65 37 62 63 61 34 30 5d 20>
Our_app_logs: | stderr: The specified session has been invalidated for some reason.
Our_app_logs: | <Buffer 54 68 65 20 73 70 65 63 69 66 69 65 64 20 73 65 73 73 69 6f 6e 20 68 61 73 20 62 65 65 6e 20 69 6e 76 61 6c 69 64 61 74 65 64 20 66 6f 72 20 73 6f 6d ... >
Our_app_logs: | stderr: Last message repeated 1 times
Our_app_logs: | <Buffer 20 20 20 20 4c 61 73 74 20 6d 65 73 73 61 67 65 20 72 65 70 65 61 74 65 64 20 31 20 74 69 6d 65 73 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: frame I:13 Avg QP: 5.39 size: 2478
Our_app_logs: | <Buffer 66 72 61 6d 65 20 49 3a 31 33 20 20 20 20 41 76 67 20 51 50 3a 20 35 2e 33 39 20 20 73 69 7a 65 3a 20 20 32 34 37 38 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: frame P:1465 Avg QP:13.51 size: 235
Our_app_logs: | <Buffer 66 72 61 6d 65 20 50 3a 31 34 36 35 20 20 41 76 67 20 51 50 3a 31 33 2e 35 31 20 20 73 69 7a 65 3a 20 20 20 32 33 35 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: mb I I16..4: 99.2% 0.1% 0.7%
Our_app_logs: | <Buffer 6d 62 20 49 20 20 49 31 36 2e 2e 34 3a 20 39 39 2e 32 25 20 20 30 2e 31 25 20 20 30 2e 37 25 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: mb P I16..4: 0.3% 0.0% 0.0% P16..4: 0.1% 0.0% 0.0% 0.0% 0.0% skip:99.6%
Our_app_logs: | <Buffer 6d 62 20 50 20 20 49 31 36 2e 2e 34 3a 20 20 30 2e 33 25 20 20 30 2e 30 25 20 20 30 2e 30 25 20 20 50 31 36 2e 2e 34 3a 20 20 30 2e 31 25 20 20 30 2e ... >
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: 8x8 transform intra:0.3% inter:17.3%
Our_app_logs: | <Buffer 38 78 38 20 74 72 61 6e 73 66 6f 72 6d 20 69 6e 74 72 61 3a 30 2e 33 25 20 69 6e 74 65 72 3a 31 37 2e 33 25 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: coded y,uvDC,uvAC intra: 1.4% 6.9% 4.7% inter: 0.0% 0.0% 0.0%
Our_app_logs: | <Buffer 63 6f 64 65 64 20 79 2c 75 76 44 43 2c 75 76 41 43 20 69 6e 74 72 61 3a 20 31 2e 34 25 20 36 2e 39 25 20 34 2e 37 25 20 69 6e 74 65 72 3a 20 30 2e 30 ... >
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: i16 v,h,dc,p: 90% 5% 5% 0%
Our_app_logs: | <Buffer 69 31 36 20 76 2c 68 2c 64 63 2c 70 3a 20 39 30 25 20 20 35 25 20 20 35 25 20 20 30 25 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 23% 18% 51% 6% 0% 0% 0% 0% 3%
Our_app_logs: | <Buffer 69 38 20 76 2c 68 2c 64 63 2c 64 64 6c 2c 64 64 72 2c 76 72 2c 68 64 2c 76 6c 2c 68 75 3a 20 32 33 25 20 31 38 25 20 35 31 25 20 20 36 25 20 20 30 25 ... >
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 33% 25% 40% 0% 0% 0% 0% 0% 0%
Our_app_logs: | <Buffer 69 34 20 76 2c 68 2c 64 63 2c 64 64 6c 2c 64 64 72 2c 76 72 2c 68 64 2c 76 6c 2c 68 75 3a 20 33 33 25 20 32 35 25 20 34 30 25 20 20 30 25 20 20 30 25 ... >
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: i8c dc,h,v,p: 86% 7% 6% 0%
Our_app_logs: | <Buffer 69 38 63 20 64 63 2c 68 2c 76 2c 70 3a 20 38 36 25 20 20 37 25 20 20 36 25 20 20 30 25 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: Weighted P-Frames: Y:0.1% UV:0.1%
Our_app_logs: | <Buffer 57 65 69 67 68 74 65 64 20 50 2d 46 72 61 6d 65 73 3a 20 59 3a 30 2e 31 25 20 55 56 3a 30 2e 31 25 0a>
Our_app_logs: | stderr: [libx264 @ 0x7f998e790080] <Buffer 5b 6c 69 62 78 32 36 34 20 40 20 30 78 37 66 39 39 38 65 37 39 30 30 38 30 5d 20>
Our_app_logs: | stderr: kb/s:2041.23
Our_app_logs: | <Buffer 6b 62 2f 73 3a 32 30 34 31 2e 32 33 0a>
Our_app_logs: | stderr: Conversion failed!
Our_app_logs: | <Buffer 43 6f 6e 76 65 72 73 69 6f 6e 20 66 61 69 6c 65 64 21 0a>
Our_app_logs: | close, code: 1, signal: null
Our_app_logs: | from react application: 14203
Our_app_logs: | Status ok...
Our_app_logs: | Data ok...
Our_app_logs: | FFmpeg ok...
Our_app_logs: | Writeable ok... <Buffer c4 81 0e 11 00 00 00 00 01 61 c7 80 5b 00 b6 72 03 bc 00 b7 03 de 59 7f 3c 27 80 01 b3 87 bc b2 e6 84 d0 f0 02 2d c0 00 00 00 01 61 00 08 70 c7 80 5b ... > undefined
Our_app_logs: | stderr: ffmpeg version 4.2.4 Copyright (c) 2000-2020 the FFmpeg developers
Our_app_logs: | built with gcc 9.2.0 (Alpine 9.2.0)
Our_app_logs: | configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gnutls --enable-gpl --enable-libass --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --disable-stripping --disable-static --disable-librtmp --enable-vaapi --enable-vdpau --enable-libopus --disable-debug
Our_app_logs: | <Buffer 66 66 6d 70 65 67 20 76 65 72 73 69 6f 6e 20 34 2e 32 2e 34 20 43 6f 70 79 72 69 67 68 74 20 28 63 29 20 32 30 30 30 2d 32 30 32 30 20 74 68 65 20 46 ... >
Our_app_logs: | stderr: libavutil 56. 31.100 / 56. 31.100
Our_app_logs: | libavcodec 58. 54.100 / 58. 54.100
Our_app_logs: | libavformat 58. 29.100 / 58. 29.100
Our_app_logs: | libavdevice 58. 8.100 / 58. 8.100
Our_app_logs: | libavfilter 7. 57.100 / 7. 57.100
Our_app_logs: | libavresample 4. 0. 0 / 4. 0. 0
Our_app_logs: | libswscale 5. 5.100 / 5. 5.100
Our_app_logs: | libswresample 3. 5.100 / 3. 5.100
Our_app_logs: | libpostproc 55. 5.100 / 55. 5.100
Our_app_logs: | <Buffer 20 20 6c 69 62 61 76 75 74 69 6c 20 20 20 20 20 20 35 36 2e 20 33 31 2e 31 30 30 20 2f 20 35 36 2e 20 33 31 2e 31 30 30 0a 20 20 6c 69 62 61 76 63 6f ... >
Our_app_logs: | stderr: [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | Last message repeated 4 times
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | <Buffer 5b 68 32 36 34 20 40 20 30 78 37 66 32 39 39 66 34 34 66 36 30 30 5d 20 6e 6f 6e 2d 65 78 69 73 74 69 6e 67 20 50 50 53 20 31 34 20 72 65 66 65 72 65 ... >
Our_app_logs: | stderr: [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | Last message repeated 5 times
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] non-existing PPS 14 referenced
Our_app_logs: | [h264 @ 0x7f299f44f600] decode_slice_header error
Our_app_logs: | [h264 @ 0x7f299f44f600] no frame!
The FFmpeg config:
const FFMPEG_CONFIG = [
'-i',
'-',
// video codec config: low latency, adaptive bitrate
// '-vcodec',
// 'copy',
'-c:v',
'libx264',
'-preset',
'veryfast',
'-tune',
'zerolatency',
// audio codec config: sampling frequency (11025, 22050, 44100), bitrate 64 kbits
'-c:a',
'aac',
'-ar',
'44100',
'-b:a',
'64k',
//force to overwrite
'-y',
// used for audio sync
'-use_wallclock_as_timestamps',
'1',
'-async',
'1',
//'-filter_complex', 'aresample=44100', // resample audio to 44100Hz, needed if input is not 44100
//'-strict', 'experimental',
'-bufsize',
'1000',
'-f',
'flv',
];
The process:
const process = child_process.spawn('ffmpeg', [
...FFMPEG_CONFIG,
// 'local.bin',
url,
]);
process.stderr.on('data', data => {
console.log(`stderr: ${data}`, data);
});
process.stdin.on('error', e => {
console.log('FFmpeg STDIN Error', e);
});
process.on('error', err => console.log(err));
process.on('close', (code, signal) => {
console.log(`close, code: ${code}, signal: ${signal}`);
});
The writing (run for each chunk of data received from the client):
if (!Buffer.isBuffer(data)) return;
if (!process.stdin.writable) return;
process.stdin.write(data);
I found another FFmpeg config that works perfectly:
-f lavfi -re -i anullsrc -f h264 -thread_queue_size 1024 -framerate 10 -probesize 100 -i - -vcodec copy -acodec aac -g 20 -f flv
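For reference, a sketch of that flag string split into a child_process.spawn args array like the FFMPEG_CONFIG above (the url is appended as before):
const WORKING_FFMPEG_CONFIG = [
  '-f', 'lavfi', '-re', '-i', 'anullsrc', // generate a silent audio track
  '-f', 'h264', '-thread_queue_size', '1024',
  '-framerate', '10', '-probesize', '100',
  '-i', '-', // video from stdin
  '-vcodec', 'copy', // pass the H.264 stream through instead of re-encoding
  '-acodec', 'aac',
  '-g', '20',
  '-f', 'flv',
];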
Update: There is another issue with this if you use Docker: the built-in Docker network is limited, so you have to set up a network manually between the services. Why? The data transfer between the services is huge, and with this limitation FFmpeg does not receive enough data.

Webpacker: The resolved_paths option has been deprecated. Use additional_paths instead

I bumped webpacker from 4.x to 5.2.1 and started getting this warning:
The resolved_paths option has been deprecated. Use additional_paths instead.
This seems straightforward enough, my config/webpacker.yml was almost unmodified:
# Additional paths webpack should lookup modules
# ['app/assets', 'engine/foo/app/assets']
resolved_paths: [
'app/assets',
]
But doing a simple s/resolved_paths/additional_paths/ there doesn't work:
[Webpacker] Compiling...
[Webpacker] Compilation failed:
Hash: 7448f36a43523a84e146
Version: webpack 4.44.1
Time: 5803ms
Built at: 10/15/2020 11:57:06 AM
Asset Size Chunks Chunk Names
js/application-a019b363e4513fe092e6.js 3.02 MiB application [emitted] [immutable] application
js/application-a019b363e4513fe092e6.js.map 3.03 MiB application [emitted] [dev] application
js/hello_react-40e806bdb6de496532d8.js 1.05 MiB hello_react [emitted] [immutable] hello_react
js/hello_react-40e806bdb6de496532d8.js.map 1.21 MiB hello_react [emitted] [dev] hello_react
js/server_rendering-9cd9dcc6e1cebb2a8063.js 2.25 MiB server_rendering [emitted] [immutable] server_rendering
js/server_rendering-9cd9dcc6e1cebb2a8063.js.map 2.44 MiB server_rendering [emitted] [dev] server_rendering
manifest.json 1.05 KiB [emitted]
Entrypoint application = js/application-a019b363e4513fe092e6.js js/application-a019b363e4513fe092e6.js.map
Entrypoint hello_react = js/hello_react-40e806bdb6de496532d8.js js/hello_react-40e806bdb6de496532d8.js.map
Entrypoint server_rendering = js/server_rendering-9cd9dcc6e1cebb2a8063.js js/server_rendering-9cd9dcc6e1cebb2a8063.js.map
[./app/javascript/channels sync recursive _channel\.js$] ./app/javascript/channels sync _channel\.js$ 160 bytes {application} [built]
[./app/javascript/channels/index.js] 211 bytes {application} [built]
[./app/javascript/components sync recursive ^\.\/.*$] ./app/javascript/components sync ^\.\/.*$ 2.42 KiB {application} {server_rendering} [built]
[./app/javascript/packs/application.js] 10.3 KiB {application} [built]
[./app/javascript/packs/hello_react.jsx] 1.05 KiB {hello_react} [built]
[./app/javascript/packs/server_rendering.js] 301 bytes {server_rendering} [built]
[./node_modules/webpack/buildin/amd-options.js] (webpack)/buildin/amd-options.js 80 bytes {application} {server_rendering} [built]
[./node_modules/webpack/buildin/global.js] (webpack)/buildin/global.js 905 bytes {application} {server_rendering} [built]
[./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 552 bytes {application} {server_rendering} [built]
+ 474 hidden modules
ERROR in ./app/javascript/components/menu/MenuComponent.jsx
Module not found: Error: Can't resolve 'images/ellipsis-v.svg' in '/home/me/app/javascript/components/menu'
So obviously additional_paths isn't just a drop-in replacement, even though the docs suggest it should be.
Before I jump into the source to try to understand what's happening here, anyone got a quick fix?
The error is logged in the web browser's console, and it comes from the npm package. You need to upgrade both the webpacker gem and the @rails/webpacker npm package.
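A sketch of the matching upgrade, pinning the npm package to the same 5.2.1 the gem was bumped to (adjust to your exact versions):
bundle update webpacker
yarn upgrade @rails/webpacker@5.2.1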

Crash log becomes unknown

My app crashes and I am trying to find where the problem is; I suspect abandoned memory. To find where I went wrong I looked at my crash logs, but the reason just displays unknown, and the log is this:
Incident Identifier: 1EE91CB2-E67A-4D8B-84BF-19E4B3B98747
CrashReporter Key: b836d79e5ce230ad9b3663fe72a73cbf9aa7bd51
Hardware Model: iPhone6,1
OS Version: iPhone OS 7.0.4 (11B554a)
Kernel Version: Darwin Kernel Version 14.0.0: Fri Sep 27 23:08:32 PDT 2013; root:xnu-2423.3.12~1/RELEASE_ARM64_S5L8960X
Date: 2013-11-17 20:37:45 +0900
Time since snapshot: 105 ms
Free pages: 6464
Active pages: 67311
Inactive pages: 29500
Speculative pages: 4757
Throttled pages: 0
Purgeable pages: 0
Wired pages: 92047
File-backed pages: 11060
Anonymous pages: 90508
Compressions: 578224
Decompressions: 122821
Compressor Size: 60458
Uncompressed Pages in Compressor: 182193
Largest process: backboardd
Processes
Name <UUID> rpages recent_max fds [reason] (state)
MobileMail <387c38c23acc32dc912a5088ef6b2b66> 14173 14173 200 [vm-pageshortage] (continuous)
MobileSMS <fd98ac3fad52357e879f7fbd74f64bd1> 3095 3095 200 [vm-pageshortage] (background)
tccd <a4190e0e6f6b3d23b06326c8935a5bb4> 238 238 200 [vm-pageshortage] (daemon)
kbd <4350c1efc23b3182809fcb6d8a7885fd> 3645 3645 200 [vm-pageshortage] (daemon)
librariand <f9e63967978833b585958d2d38f51e16> 1334 1334 200 [vm-pageshortage] (daemon)
MyAPP <7049d9f9e2d932a5a72e8046800f8562> 117514 117514 200 [vm-pageshortage] (frontmost) (resume)
ptpd <872030b325d9383b95a5434d41f77b25> 1277 1277 200 (daemon)
identityservices <cd1fff47d6ad3b0f85cdc5fb39d8b53e> 658 658 100 (daemon)
vmd <19de7c691b3137fea83e23261df6802a> 220 220 50 (daemon)
imagent <5080234127f5363fb539ffc3965af6e2> 596 596 50 (daemon)
syslogd <5c3a246617d3399e977efc84c2e29df2> 709 709 50 (daemon)
wifid <ca4f06468bf03d0f8847089e8cd320f1> 614 614 50 (daemon)
locationd <10f268a18d5d3343ab21be48bb005ecf> 1502 1502 100 (daemon)
powerd <a1fc82c399dc36e2b18a6fbb3d936a88> 164 164 100 (daemon)
iaptransportd <4c622f6c4295395997e7a6ec783b4623> 267 267 100 (daemon)
mediaserverd <f067d4c2a21a30fbabab30d9c94ccbd3> 95553 95667 50 (daemon)
mDNSResponder <2b8ad561938f3fc0a6255b607f503040> 391 391 100 (daemon)
apsd <ceb7855af3a63c2682fab13d5e0aeb82> 731 731 100 (daemon)
dataaccessd <b155854105f531248c2a44fd3733d59e> 1491 1491 200 (daemon)
sharingd <550630f3f5dc3f0aa08ba04876d82e6d> 594 594 50 (daemon)
itunesstored <0d023473a4a93c93a531210de784b155> 2020 2020 200 (daemon)
calaccessd <d6960604dc2c37499cd597b510055d7e> 667 667 200 (daemon)
SpringBoard <fe632b47e4ee342baf4b3701cd11b242> 18539 18539 100
backboardd <2f84882cf3693dfb921f4e0d38966f50> 138729 138729 50 (daemon)
fseventsd <a0223d346d4431a5ba1caabf8505b40e> 713 713 50 (daemon)
lockdownd <e280cf66209e3be980809a7d93eea76e> 382 382 50 (daemon)
configd <bb6e02e801a93ef896f2f3cf5cbb00fe> 710 710 50 (daemon)
fairplayd.H2 <da123871e48a3b6a9a2998f428e5c05a> 151 151 100 (daemon)
aggregated <6189b3e3d0c83a879b99cf7cd566dffb> 1090 1090 100 (daemon)
BTServer <3669aefbfb2e3577b17bde9598feda76> 511 511 100 (daemon)
distnoted <c40569cbea09312b9310bc74cbc88e29> 177 177 100 (daemon)
UserEventAgent <0d33b64c0c003a65b9a87c6622921781> 796 796 100 (daemon)
networkd <c7aa87e0c2d33d379a09598281a5e3ee> 873 873 100 (daemon)
biometrickitd <96a562b32d2f3ae0b3d23706c2f5d5ac> 296 296 100 (daemon)
filecoordination <72a4cec360d435c09cb83d2316317288> 324 324 200 (daemon)
ubd <231dc91e9b11307eb98874f9bbfaa86c> 1256 1256 100 (daemon)
EscrowSecurityAl <e14ee8c5bc0f3447b6cdaad44ab402da> 236 236 200 (daemon)
touchsetupd <0c315f01ae8d3675ad1a4eda4c9b18bb> 211 211 200 (daemon)
notification_pro <72244e97bc7d33408cd01f8c7fb7d2eb> 127 127 200 (daemon)
DTMobileIS <5fec282802c03cb49ee4a1cff408ac58> 17147 17147 200 (daemon)
cplogd <96828e7047bf36e2a1cdffcc1be700f8> 149 149 200 (daemon)
pasteboardd <6a060fcef15735f6884cc7e7f388d7bb> 139 139 200 (daemon)
wirelessproxd <9f112d11a5f734019013a43e9fc677a9> 93 93 200 (daemon)
CommCenter <ba4a2aecbe913f0ca31c8902e444db0d> 1867 1867 100 (daemon)
notifyd <bd919e93d6293562af0b7ec0e21247a0> 300 300 100 (daemon)
**End**
How can I find which line was wrong?
It looks like you're running out of memory. I recommend adding some more code in didReceiveMemoryWarning, getting rid of things that you don't need (a sketch follows). If this is only from one person, though, it may not be your app; it may just be that they have an old phone, or they are doing way too many things in the background.
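A minimal sketch of that override (Swift for brevity; the cache property is hypothetical, the point is to release anything that can be rebuilt later):
import UIKit

final class MyViewController: UIViewController {
    // hypothetical in-memory cache that can be rebuilt on demand
    private let imageCache = NSCache<NSString, UIImage>()

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        imageCache.removeAllObjects() // free what can be recreated later
    }
}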
