Playwright Chrome instance does not respond while opening a new page - Docker

I am using the playwright-rust implementation inside Docker to automate a website. The issue I am facing is that when it reaches the point of opening a new page, it stops responding inside the container. These are the logs I get from Playwright:
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] SEND Req { id: 10, guid: "browser-type#43af88863e863765963c2986930eac42", method: "launch", params: {"headless": Bool(true)} }
None
Committing offsets: Ok(())
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] RECV {"guid":"browser-type#43af88863e863765963c2986930eac42","method":"__create__","params":{"type":"Browser","initializer":{"version":"92.0.4498.0","name":"chromium"},"guid":"browser#aeeb8a7bc1cd03be692db83160e2636d"}}
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] RECV {"id":10,"result":{"browser":{"guid":"browser#aeeb8a7bc1cd03be692db83160e2636d"}}}
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] SEND Req { id: 11, guid: "browser#aeeb8a7bc1cd03be692db83160e2636d", method: "newContext", params: {"sdkLanguage": String("")} }
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] RECV {"guid":"browser#aeeb8a7bc1cd03be692db83160e2636d","method":"__create__","params":{"type":"BrowserContext","initializer":{"isChromium":true},"guid":"browser-context#744e111f024c383f760775a92d3f98cb"}}
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] RECV {"id":11,"result":{"context":{"guid":"browser-context#744e111f024c383f760775a92d3f98cb"}}}
[2023-01-12T18:55:38Z DEBUG playwright::imp::core::transport] SEND Req { id: 12, guid: "browser-context#744e111f024c383f760775a92d3f98cb", method: "newPage", params: {} }
This is the Rust code where it is called:
pub async fn build_browser(
    proxy: Option<ProxySettings>,
    playwright: Arc<Playwright>,
) -> Result<(Arc<Browser>, Arc<BrowserContext>)> {
    debug!("creating new browser");
    // should be switchable to any browser later via configuration; Chromium for now
    let browser_type = playwright.chromium();
    let mut browser_launcher = browser_type.launcher().headless(true);
    if let Some(proxy) = proxy {
        browser_launcher = browser_launcher.proxy(proxy);
    }
    let browser = Arc::new(browser_launcher.launch().await?);
    let browser_context = Arc::new(browser.context_builder().build().await?);
    debug!("browser has been created");
    Ok((browser, browser_context))
}
pub async fn open_new(
    email: &str,
    password_hash: &str,
    proxy: Option<ProxySettings>,
    playwright: Arc<Playwright>,
    cookies_manager: Arc<Mutex<CookiesManager>>,
) -> Result<(Self, bool)> {
    info!("We are trying to login in the browser");
    let (browser, browser_context) = Self::build_browser(proxy, playwright.clone()).await?;
    // keep using "tab" for better understanding
    info!("browser context built");
    let _tab = browser_context.new_page().await.unwrap();
    //...
}
After browser_context.new_page().await.unwrap() is called, it never returns and just keeps hanging there. By contrast, the same code works when run outside of Docker. And one more interesting thing: if I look at the container's processes, I can see Chromium instances running there:
UID PID PPID C STIME TTY TIME CMD
root 19272 19201 0 22:57 ? 00:00:10 /app/target/release/service
root 19550 19272 0 22:57 ? 00:00:00 /bin/sh /root/.cache/ms-playwright/playwright-rust/driver/playwright.sh run-driver
root 19554 19550 0 22:57 ? 00:00:00 /root/.cache/ms-playwright/playwright-rust/driver/node /root/.cache/ms-playwright/playwright-rust/driver/package/lib/cli/cli.js run-driver
root 20265 19554 31 23:08 ? 00:14:57 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --disable-background-networking --enable-features=NetworkService,NetworkServiceInProcess --disable-background-timer-throttling --disable-backgrounding-occluded-windows --disable-breakpad --disable-client-side-phishing-detection --disable-component-extensions-with-background-pages --disable-default-apps --disable-dev-shm-usage --disable-extensions --disable-features=TranslateUI,BlinkGenPropertyTrees,ImprovedCookieControls,SameSiteByDefaultCookies,LazyFrameLoading --allow-pre-commit-input --disable-hang-monitor --disable-ipc-flooding-protection --disable-popup-blocking --disable-prompt-on-repost --disable-renderer-backgrounding --disable-sync --force-color-profile=srgb --metrics-recording-only --no-first-run --enable-automation --password-store=basic --use-mock-keychain --no-service-autorun --user-data-dir=/tmp/playwright_chromiumdev_profile-sJjhND --remote-debugging-pipe --headless --hide-scrollbars --mute-audio --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --no-sandbox --no-startup-window
root 20267 20265 0 23:08 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-zygote-sandbox --no-sandbox --headless --headless
root 20268 20265 0 23:08 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-sandbox --headless --headless
root 20286 20268 81 23:08 ? 00:39:02 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=renderer --no-sandbox --disable-dev-shm-usage --disable-background-timer-throttling --disable-breakpad --enable-automation --force-color-profile=srgb --remote-debugging-pipe --allow-pre-commit-input --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --ozone-platform=headless --field-trial-handle=5151956782728765150,10929878616647120254,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --disa
root 20321 20267 0 23:08 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=gpu-process --field-trial-handle=5151956782728765150,10929878616647120254,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --no-sandbox --disable-dev-shm-usage --disable-breakpad --headless --ozone-platform=headless --headless --gpu-preferences=UAAAAAAAAAAgAAAQAAAAAAAAAAAAAAAAAABgAAAAAAAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAACAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAgAAAAAAAAACAAAAAAAAAA= --use-gl=disabled --override-use-software-gl-for-tests --shared-files
root 22047 19554 31 23:17 ? 00:12:11 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --disable-background-networking --enable-features=NetworkService,NetworkServiceInProcess --disable-background-timer-throttling --disable-backgrounding-occluded-windows --disable-breakpad --disable-client-side-phishing-detection --disable-component-extensions-with-background-pages --disable-default-apps --disable-dev-shm-usage --disable-extensions --disable-features=TranslateUI,BlinkGenPropertyTrees,ImprovedCookieControls,SameSiteByDefaultCookies,LazyFrameLoading --allow-pre-commit-input --disable-hang-monitor --disable-ipc-flooding-protection --disable-popup-blocking --disable-prompt-on-repost --disable-renderer-backgrounding --disable-sync --force-color-profile=srgb --metrics-recording-only --no-first-run --enable-automation --password-store=basic --use-mock-keychain --no-service-autorun --user-data-dir=/tmp/playwright_chromiumdev_profile-c8EaqE --remote-debugging-pipe --headless --hide-scrollbars --mute-audio --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --no-sandbox --no-startup-window
root 22049 22047 0 23:17 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-zygote-sandbox --no-sandbox --headless --headless
root 22050 22047 0 23:17 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-sandbox --headless --headless
root 22067 22050 81 23:17 ? 00:31:12 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=renderer --no-sandbox --disable-dev-shm-usage --disable-background-timer-throttling --disable-breakpad --enable-automation --force-color-profile=srgb --remote-debugging-pipe --allow-pre-commit-input --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --ozone-platform=headless --field-trial-handle=3248009288479842313,6307992858770365040,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --disab
root 22097 22049 0 23:17 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=gpu-process --field-trial-handle=3248009288479842313,6307992858770365040,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --no-sandbox --disable-dev-shm-usage --disable-breakpad --headless --ozone-platform=headless --headless --gpu-preferences=UAAAAAAAAAAgAAAQAAAAAAAAAAAAAAAAAABgAAAAAAAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAACAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAgAAAAAAAAACAAAAAAAAAA= --use-gl=disabled --override-use-software-gl-for-tests --shared-files
root 23901 19554 30 23:44 ? 00:03:35 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --disable-background-networking --enable-features=NetworkService,NetworkServiceInProcess --disable-background-timer-throttling --disable-backgrounding-occluded-windows --disable-breakpad --disable-client-side-phishing-detection --disable-component-extensions-with-background-pages --disable-default-apps --disable-dev-shm-usage --disable-extensions --disable-features=TranslateUI,BlinkGenPropertyTrees,ImprovedCookieControls,SameSiteByDefaultCookies,LazyFrameLoading --allow-pre-commit-input --disable-hang-monitor --disable-ipc-flooding-protection --disable-popup-blocking --disable-prompt-on-repost --disable-renderer-backgrounding --disable-sync --force-color-profile=srgb --metrics-recording-only --no-first-run --enable-automation --password-store=basic --use-mock-keychain --no-service-autorun --user-data-dir=/tmp/playwright_chromiumdev_profile-THqdoC --remote-debugging-pipe --headless --hide-scrollbars --mute-audio --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --no-sandbox --no-startup-window
root 23903 23901 0 23:44 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-zygote-sandbox --no-sandbox --headless --headless
root 23904 23901 0 23:44 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-sandbox --headless --headless
root 23921 23904 80 23:44 ? 00:09:21 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=renderer --no-sandbox --disable-dev-shm-usage --disable-background-timer-throttling --disable-breakpad --enable-automation --force-color-profile=srgb --remote-debugging-pipe --allow-pre-commit-input --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --ozone-platform=headless --field-trial-handle=10776091224041476131,14293814787319111420,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --dis
root 23961 23903 0 23:44 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=gpu-process --field-trial-handle=10776091224041476131,14293814787319111420,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --no-sandbox --disable-dev-shm-usage --disable-breakpad --headless --ozone-platform=headless --headless --gpu-preferences=UAAAAAAAAAAgAAAQAAAAAAAAAAAAAAAAAABgAAAAAAAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAACAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAgAAAAAAAAACAAAAAAAAAA= --use-gl=disabled --override-use-software-gl-for-tests --shared-files
root 24917 19554 24 23:55 ? 00:00:04 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --disable-background-networking --enable-features=NetworkService,NetworkServiceInProcess --disable-background-timer-throttling --disable-backgrounding-occluded-windows --disable-breakpad --disable-client-side-phishing-detection --disable-component-extensions-with-background-pages --disable-default-apps --disable-dev-shm-usage --disable-extensions --disable-features=TranslateUI,BlinkGenPropertyTrees,ImprovedCookieControls,SameSiteByDefaultCookies,LazyFrameLoading --allow-pre-commit-input --disable-hang-monitor --disable-ipc-flooding-protection --disable-popup-blocking --disable-prompt-on-repost --disable-renderer-backgrounding --disable-sync --force-color-profile=srgb --metrics-recording-only --no-first-run --enable-automation --password-store=basic --use-mock-keychain --no-service-autorun --user-data-dir=/tmp/playwright_chromiumdev_profile-K0z8PD --remote-debugging-pipe --headless --hide-scrollbars --mute-audio --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --no-sandbox --no-startup-window
root 24919 24917 0 23:55 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-zygote-sandbox --no-sandbox --headless --headless
root 24920 24917 0 23:55 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=zygote --no-sandbox --headless --headless
root 24938 24920 64 23:55 ? 00:00:12 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=renderer --no-sandbox --disable-dev-shm-usage --disable-background-timer-throttling --disable-breakpad --enable-automation --force-color-profile=srgb --remote-debugging-pipe --allow-pre-commit-input --blink-settings=primaryHoverType=2,availableHoverTypes=2,primaryPointerType=4,availablePointerTypes=4 --ozone-platform=headless --field-trial-handle=6371735118124582706,3023160193025494599,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --disab
root 24963 24919 0 23:55 ? 00:00:00 /root/.cache/ms-playwright/chromium-878941/chrome-linux/chrome --type=gpu-process --field-trial-handle=6371735118124582706,3023160193025494599,131072 --enable-features=NetworkService,NetworkServiceInProcess --disable-features=BlinkGenPropertyTrees,ImprovedCookieControls,LazyFrameLoading,PaintHolding,SameSiteByDefaultCookies,TranslateUI --no-sandbox --disable-dev-shm-usage --disable-breakpad --headless --ozone-platform=headless --headless --gpu-preferences=UAAAAAAAAAAgAAAQAAAAAAAAAAAAAAAAAABgAAAAAAAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAACAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAgAAAAAAAAACAAAAAAAAAA= --use-gl=disabled --override-use-software-gl-for-tests --shared-files
I can't figure out what's wrong. Any help would be appreciated.
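Not a confirmed fix for this hang, but Playwright's general Docker guidance is worth ruling out first: run the container with an init process so Chromium's many short-lived child processes get reaped, and give Chromium enough shared memory. A sketch of the flags, with the image name as a placeholder:

# --init reaps zombie children; Chromium spawns and kills many subprocesses
# --ipc=host avoids Chromium running out of shared memory inside the container
# (--shm-size=1g is an alternative if host IPC is not acceptable)
docker run --init --ipc=host my-playwright-service:latest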

Related

How to control the count of processes spawned by ng build --prod=true to avoid the Bitbucket pipeline failing with 'Build' exceeded memory limit

I am capturing memory and process snapshots on the instance where ng build is triggered. I can show the point at which a number of processes are spawned by ng build. Is there a way to control this number?
total used free shared buff/cache available
Mem: 30G 8.5G 2.7G 215M 19G 21G
Swap: 0B 0B 0B
Fri Jan 27 15:07:19 UTC 2023
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.0 4288 708 ? Ss 15:02 0:00 /bin/sh -c exit $( (/usr/bin/mkfifo /opt/atlassian/pipelines/agent/tmp/build_result && /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result) || /bin/echo 1)
root 8 0.0 0.0 4288 96 ? S 15:02 0:00 /bin/sh -c exit $( (/usr/bin/mkfifo /opt/atlassian/pipelines/agent/tmp/build_result && /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result) || /bin/echo 1)
root 9 0.0 0.0 4200 716 ? S 15:02 0:00 /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result
root 11 0.0 0.0 4288 1460 ? Ss 15:02 0:00 /bin/sh /opt/atlassian/pipelines/agent/tmp/wrapperScript14257846929627798257.sh
root 35 0.0 0.0 4288 764 ? S 15:02 0:00 /bin/sh /opt/atlassian/pipelines/agent/tmp/buildScript1578831321044327918.sh
root 36 0.0 0.0 18004 2936 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 37 0.0 0.0 18008 2416 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 38 0.0 0.0 18008 2416 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 40 0.0 0.0 18024 2924 ? S 15:02 0:00 bash docker-build.sh
root 60 0.0 0.1 666880 41336 ? Sl 15:02 0:00 npm
root 71 0.0 0.0 4296 808 ? S 15:02 0:00 sh -c npm i --unsafe-perm -g @angular/cli && npm i && npm run copyi18n && npm run bump_version && node --max_old_space_size=6656 node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 327 125 10.3 4375844 3362764 ? Rl 15:03 5:00 ng build --prod=true --base-href=/
root 574 0.0 0.0 4196 680 ? S 15:07 0:00 sleep 5
root 576 0.0 0.0 36644 2844 ? R 15:07 0:00 ps aux
In the above there is PID 327 and only one ng build instance.
After 10 seconds I see the following:
Fri Jan 27 15:07:29 UTC 2023
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.0 4288 708 ? Ss 15:02 0:00 /bin/sh -c exit $( (/usr/bin/mkfifo /opt/atlassian/pipelines/agent/tmp/build_result && /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result) || /bin/echo 1)
root 8 0.0 0.0 4288 96 ? S 15:02 0:00 /bin/sh -c exit $( (/usr/bin/mkfifo /opt/atlassian/pipelines/agent/tmp/build_result && /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result) || /bin/echo 1)
root 9 0.0 0.0 4200 716 ? S 15:02 0:00 /bin/cat /opt/atlassian/pipelines/agent/tmp/build_result
root 11 0.0 0.0 4288 1460 ? Ss 15:02 0:00 /bin/sh /opt/atlassian/pipelines/agent/tmp/wrapperScript14257846929627798257.sh
root 35 0.0 0.0 4288 764 ? S 15:02 0:00 /bin/sh /opt/atlassian/pipelines/agent/tmp/buildScript1578831321044327918.sh
root 36 0.0 0.0 18004 2936 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 37 0.0 0.0 18008 2416 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 38 0.0 0.0 18008 2416 ? S 15:02 0:00 /bin/bash -i /opt/atlassian/pipelines/agent/tmp/bashScript16516797543053797206.sh
root 40 0.0 0.0 18024 2924 ? S 15:02 0:00 bash docker-build.sh
root 60 0.0 0.1 666880 41336 ? Sl 15:02 0:00 npm
root 71 0.0 0.0 4296 808 ? S 15:02 0:00 sh -c npm i --unsafe-perm -g @angular/cli && npm i && npm run copyi18n && npm run bump_version && node --max_old_space_size=6656 node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 327 123 12.5 5146756 4055172 ? Sl 15:03 5:07 ng build --prod=true --base-href=/
root 578 40.1 0.2 611084 90388 ? Rl 15:07 0:02 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 585 86.2 0.3 651124 129216 ? Rl 15:07 0:04 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 596 25.6 0.3 638388 117496 ? Rl 15:07 0:01 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 604 29.0 0.2 609828 88928 ? Rl 15:07 0:01 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 611 102 0.2 617016 95864 ? Rl 15:07 0:04 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 618 32.7 0.3 620824 99604 ? Rl 15:07 0:01 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 625 66.2 0.4 657444 136912 ? Rl 15:07 0:02 /usr/local/bin/node --max_old_space_size=6656 /opt/atlassian/pipelines/agent/build/node_modules/worker-farm/lib/child/index.js /usr/local/bin/node /opt/atlassian/pipelines/agent/build/node_modules/@angular/cli/bin/ng build --prod=true --base-href=/
root 633 0.0 0.0 4196 652 ? S 15:07 0:00 sleep 5
root 635 0.0 0.0 36644 2804 ? R 15:07 0:00 ps aux
After this point the available RAM drops, and the build eventually fails:
total used free shared buff/cache available
Mem: 30G 11G 762M 169M 18G 19G
Swap: 0B 0B 0B
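For context on where the extra processes come from: the worker-farm children in the listing above are webpack's parallel UglifyJS step, and worker-farm's documented default is to spawn one child per CPU that Node reports, each inheriting the --max_old_space_size=6656 ceiling visible in the command lines, which is why memory use multiplies. Checking what the build machine reports predicts the spawn count (diagnostic only, not a fix):

# worker-farm's default maxConcurrentWorkers is the CPU count Node sees,
# so this number is roughly how many extra node processes ng build will fork
node -e "console.log(require('os').cpus().length)"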
Bitbucket pipeline file:
branches:
  master:
    - step:
        size: 2x
        script:
          - bash docker-build.sh
  '{dev/*,release/*,hotfix/*}':
    - step:
        size: 2x
        script:
          - while true; do date && ps aux && echo "" && sleep 5; done &
          - while true; do free -h && echo "" && sleep 5; done &
          - bash docker-build.sh
definitions:
  services:
    node:
      image: node:10.15.3
      memory: 7680
    docker:
      memory: 512
# docker: true for running docker daemon commands; by default it is available in the step
options:
  docker: true
  size: 2x
package.json snippet
{
  "name": "de-ui",
  "version": "4.7.1",
  "scripts": {
    "ng": "ng",
    "start": "node --max_old_space_size=8192 node_modules/@angular/cli/bin/ng serve",
    "build": "ng build",
    "test": "ng test",
    "lint": "ng lint",
    "e2e": "ng e2e",
    "prod_build": "npm i --unsafe-perm -g @angular/cli && npm i && npm run copyi18n && npm run bump_version && node --max_old_space_size=6656 node_modules/@angular/cli/bin/ng build --prod=true --base-href=/",
    "copyi18n": "node ./load.po.files.js ./src/assets/i18n/po/ ./src/assets/i18n/",
    "createi18npo": "node ./load.po.files.js ./src/assets/i18n/ ./src/assets/i18n/po/",
    "update_de": "npm update @de/de-ui-core @de/de-jsf-form @de/de-ui-app @de/de-ui-api",
    "bump_version": "node ./bump_version.js",
    "serve_prod": "node --max_old_space_size=8192 node_modules/@angular/cli/bin/ng serve --prod=true"
  }
}
Note:
- size: 2x -- 8 GB available
- docker -- set to true
- Sizing:
  node:
    image: node:10.15.3
    memory: 7680
  docker:
    memory: 512
- node --max_old_space_size=6656 -- provided in ng build
1. How can I avoid so many processes from getting triggered?
2. Is there a way I can rearrange the memory allocation to avoid getting "Container 'Build' exceeded memory limit"?
I have tried changing the memory sizing but have not been able to solve it. My thinking is that if the number of processes being spawned can be controlled, the memory issue can be handled as well.

Cannot launch Playwright in Ubuntu Docker image with .NET

I've created a service with .NET 6.0 using Playwright v1.19.1 and built an image on Ubuntu 20.04 with the stages below:
sdk:6.0-focal AS build
runtime:6.0-focal AS runtime
In the code I launch the Chromium browser:
await playwright.Chromium.LaunchAsync(new BrowserTypeLaunchOptions
{
    Headless = false,
    Channel = "chrome",
    Args = new[] { "--disable-dev-shm-usage" }
});
Below is the Dockerfile:
FROM mcr.microsoft.com/dotnet/sdk:6.0-focal AS build
COPY . /app
WORKDIR /app
ENV PLAYWRIGHT_BROWSERS_PATH=/app/playwright
RUN dotnet restore "src/Service/Service.csproj"
RUN dotnet build "src/Service/Service.csproj" -c Release
RUN pwsh src/Service/bin/Release/net6.0/playwright.ps1 install chrome
RUN dotnet publish "src/Service/Service.csproj" -c Release -o /app/output
# =======================================================================================================
# Runtime
# =======================================================================================================
FROM mcr.microsoft.com/dotnet/runtime:6.0-focal AS runtime
EXPOSE 80
WORKDIR /home/site/wwwroot
COPY --from=build /app/output .
COPY --from=build /app/playwright .playwright/ms-playwright
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true \
    PLAYWRIGHT_BROWSERS_PATH=/home/site/wwwroot/.playwright/ms-playwright
The image built successfully, but when I start it I get the error below. Do you know how to solve this issue? Thank you.
Microsoft.Playwright.PlaywrightException: Browser closed.
==================== Browser output: ====================
<launching> /opt/google/chrome/chrome --disable-background-networking
--enable-features=NetworkService,NetworkServiceInProcess
--disable-background-timer-throttling --disable-backgrounding-occluded-windows
--disable-breakpad --disable-client-side-phishing-detection
--disable-component-extensions-with-background-pages
--disable-default-apps --disable-dev-shm-usage --disable-extensions
--disable-features=ImprovedCookieControls,LazyFrameLoading,GlobalMediaControls,
DestroyProfileOnBrowserClose,MediaRouter,AcceptCHFrame,AutoExpandDetailsElement
--allow-pre-commit-input --disable-hang-monitor --disable-ipc-flooding-protection
--disable-popup-blocking --disable-prompt-on-repost --disable-renderer-backgrounding
--disable-sync --force-color-profile=srgb --metrics-recording-only --no-first-run
--enable-automation --password-store=basic --use-mock-keychain
--no-service-autorun --export-tagged-pdf --no-sandbox --disable-dev-shm-usage
--user-data-dir=/tmp/playwright_chromiumdev_profile-HPOMeC
--remote-debugging-pipe --no-startup-window
<launched> pid=85
[pid=85][err] [85:85:0330/032427.114118:ERROR:ozone_platform_x11.cc(247)] Missing X server or $DISPLAY
[pid=85][err] [85:85:0330/032427.114193:ERROR:env.cc(225)] The platform failed to initialize. Exiting.
You need to set up a DISPLAY environment variable inside your container, so you will need to run in the container:
export DISPLAY='{IP}:0'
Also start your browser with Headless = true and add the --disable-gpu flag.
If you want to run the browser with a GUI, remove the --disable-gpu flag and start your browser with Headless = false.
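If a headed (non-headless) browser really is required inside the container, a virtual X server is the usual substitute for a real display. A minimal sketch using xvfb-run, assuming the published entry point is named Service.dll (a placeholder):

# install a virtual framebuffer X server in the runtime image
apt-get update && apt-get install -y xvfb
# wrap the service so Chromium finds a DISPLAY even though there is no screen
xvfb-run --auto-servernum dotnet Service.dll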

execution of chown (container files) gets hung

I'm trying to run the Zalenium Docker image on Ubuntu 18.04. Below is my command:
curl -sSL https://raw.githubusercontent.com/dosel/t/i/p | bash -s start
or
docker run --rm -ti -p 4404:4444 -v /var/run/docker.sock:/var/run/docker.sock -v /home/emiteqa/eMite/videos:/home/seluser/videos --privileged dosel/zalenium start
I got stuck at "Copying files for Dashboard"
Docker binary already present, will use that one.
Docker version 18.09.0, build 4d60db4
-- LOG 07:33:35:233537096 Ensuring docker works...
-- LOG 07:33:35:414698248 Ensuring docker-selenium is available...
haveged: haveged starting up
Copying files for Dashboard...
After investigating, I found that Zalenium gets hung at the line below in zalenium.sh:
sudo chown -R ${HOST_UID}:${HOST_GID} /home/seluser
Then I went into the Zalenium container and tried to run sudo chown on files in the container; it gets stuck:
seluser@5ada56c01231:~$ ls -ltr
total 40396
-rwxrwxr-x 1 seluser root 41277 Nov 24 15:23 zalenium.sh
-rwxrw-r-- 1 seluser root 15086 Nov 24 15:23 zalando.ico
-rwxrwxr-x 1 seluser root 770 Nov 24 15:23 wait-testingbot.sh
-rwxrwxr-x 1 seluser root 983 Nov 24 15:23 wait-saucelabs.sh
-rwxrwxr-x 1 seluser root 933 Nov 24 15:23 wait-lambdatest.sh
-rwxrwxr-x 1 seluser root 746 Nov 24 15:23 wait-cbt.sh
-rwxrwxr-x 1 seluser root 831 Nov 24 15:23 wait-browserstack.sh
-rwxrwxr-x 1 seluser root 1268 Nov 24 15:23 start-testingbot.sh
-rwxrwxr-x 1 seluser root 2679 Nov 24 15:23 start-saucelabs.sh
-rwxrwxr-x 1 seluser root 1061 Nov 24 15:23 start-lambdatest.sh
-rwxrwxr-x 1 seluser root 1078 Nov 24 15:23 start-cbt.sh
-rwxrwxr-x 1 seluser root 1287 Nov 24 15:23 start-browserstack.sh
-rwxrw-r-- 1 seluser root 2699 Nov 24 15:23 logging_info.properties
-rwxrw-r-- 1 seluser root 2738 Nov 24 15:23 logging_debug.properties
-rwxrw-r-- 1 seluser root 1082 Nov 24 15:23 logback.xml
-rwxrw-r-- 1 seluser root 2128 Nov 24 15:23 LICENSE.md
-rwxrw-r-- 1 seluser root 637 Nov 24 15:23 error.html.bak
-rwxrw-r-- 1 seluser root 10996 Nov 24 15:23 dashboard_template.html
-rwxrw-r-- 1 seluser root 4529 Nov 24 15:23 Analytics.md
-rw-rw-r-- 1 root root 41184199 Nov 24 15:23 zalenium-3.141.59v.jar
drwxrwxr-x 1 seluser root 4096 Nov 24 15:27 css
drwxrwxr-x 1 seluser root 4096 Nov 24 15:27 js
drwxrwxr-x 1 seluser root 4096 Nov 24 15:27 img
drwxrwxr-x 5 seluser seluser 4096 Dec 5 01:56 videos
-rwxrw-r-- 1 seluser seluser 3420 Dec 6 07:00 nginx.conf.bak
-rwxrw-r-- 1 seluser seluser 3404 Dec 6 07:00 nginx.conf
-rwxrw-r-- 1 seluser seluser 627 Dec 6 07:00 error.html
drwxr-xr-x 2 seluser seluser 4096 Dec 6 07:00 logs
-rw-r--r-- 1 seluser seluser 1181 Dec 6 07:00 docker_info.txt
seluser@5ada56c01231:~$ pwd
/home/seluser
seluser@5ada56c01231:~$ sudo chown seluser:seluser wait-testingbot.sh
^C^C
seluser@5ada56c01231:~$ ^C
seluser@5ada56c01231:~$ sudo chown seluser:seluser wait-testingbot.sh
seluser@5ada56c01231:~$ sudo chown seluser:seluser wait-saucelabs.sh
^C^C
seluser@5ada56c01231:~$ sudo chown seluser:seluser wait-saucelabs.sh
seluser@5ada56c01231:~$
What I have tried
If I touch a new file and run sudo chown, there is no problem.
If I CTRL+C and retry, it will pass.
Using the exact same steps starting from the Docker installation, I can run Zalenium successfully on an AWS server.
I googled but didn't find any similar issue. I suspect an environment issue, but I have no clue how to troubleshoot it.
Thanks in advance for any suggestion and help.
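One way to narrow this down is to trace the hanging chown's system calls; since the container runs with --privileged, strace should work inside it, and the last call printed before the hang shows whether it is stuck on the file itself or on the underlying storage driver or mount. A diagnostic sketch, reusing the file that hangs above:

# run the known-hanging chown under strace; -f follows child processes
sudo strace -f chown seluser:seluser wait-testingbot.sh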

Dockerfile-dev vs Dockerfile-prod

This is my Docker project structure:
├── docker-compose-dev.yml
├── docker-compose-prod.yml
└── services
├── client
│ ├── Dockerfile-dev
│ ├── Dockerfile-prod
├── nginx
│ ├── Dockerfile-dev
│ ├── Dockerfile-prod
│ ├── dev.conf
│ └── prod.conf
└── web
├── Dockerfile-dev <----- THIS
├── Dockerfile-prod <----- THIS
├── entrypoint-prod.sh
├── entrypoint.sh
├── htmlcov
├── manage.py
├── project
│ ├── __init__.py
│ ├── api
│ │ ├── __init__.py
│ │ ├── models.py
│ │ ├── templates
│ │ │ └── index.html
│ │ └── users.py
│ ├── config.py
│ ├── db
│ │ ├── Dockerfile
│ │ └── create.sql
└── requirements.txt
At the development stage, Docker images for my project are successfully created with:
$ docker-compose -f docker-compose-dev.yml up --build
Dockerfile-dev on "web" service
# base image
FROM python:3.6-alpine

# install dependencies
RUN apk update && \
    apk add --virtual build-deps gcc python-dev musl-dev && \
    apk add libffi-dev && \
    apk add postgresql-dev && \
    apk add netcat-openbsd && \
    apk add bind-tools && \
    apk add --update --no-cache g++ libxslt-dev && \
    apk add jpeg-dev zlib-dev

ENV PACKAGES="\
    dumb-init \
    musl \
    libc6-compat \
    linux-headers \
    build-base \
    bash \
    git \
    ca-certificates \
    freetype \
    libgfortran \
    libgcc \
    libstdc++ \
    openblas \
    tcl \
    tk \
    libssl1.0 \
    "

ENV PYTHON_PACKAGES="\
    numpy \
    matplotlib \
    scipy \
    scikit-learn \
    nltk \
    "

RUN apk add --no-cache --virtual build-dependencies python3 \
    && apk add --virtual build-runtime \
       build-base python3-dev openblas-dev freetype-dev pkgconfig gfortran \
    && ln -s /usr/include/locale.h /usr/include/xlocale.h \
    && python3 -m ensurepip \
    && rm -r /usr/lib/python*/ensurepip \
    && pip3 install --upgrade pip setuptools \
    && ln -sf /usr/bin/python3 /usr/bin/python \
    && ln -sf pip3 /usr/bin/pip \
    && rm -r /root/.cache \
    && pip install --no-cache-dir $PYTHON_PACKAGES \
    && pip3 install 'pandas<0.21.0' \
    && apk del build-runtime \
    && apk add --no-cache --virtual build-dependencies $PACKAGES \
    && rm -rf /var/cache/apk/*

# set working directory
WORKDIR /usr/src/app

# add and install requirements
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt

# add entrypoint.sh
COPY ./entrypoint.sh /usr/src/app/entrypoint.sh
RUN chmod +x /usr/src/app/entrypoint.sh

# add app
COPY . /usr/src/app

# run server
CMD ["/usr/src/app/entrypoint.sh"]
docker ps -as shows me the project up and running at localhost:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES SIZE
f88c27e5334f dev3_nginx "nginx -g 'daemon of…" 14 hours ago Up 14 hours 0.0.0.0:80->80/tcp dev3_nginx_1 2B (virtual 16.1MB)
f77eb7949fef dev3_client "npm start" 14 hours ago Up 14 hours 0.0.0.0:3007->3000/tcp dev3_client_1 55B (virtual 553MB)
33b1b50931a6 dev3_web "/usr/src/app/entryp…" 14 hours ago Up 14 hours 0.0.0.0:5001->5000/tcp dev3_web_1 35.3kB (virtual 3.32GB)
0e28363ab85a dev3_web-db "docker-entrypoint.s…" 3 days ago Up 14 hours 0.0.0.0:5435->5432/tcp dev3_web-db_1 63B (virtual 71.7MB)
But I can't build my production images with:
$ docker-compose -f docker-compose-prod.yml up --build
Dockerfile-prod on "web" service
(...the same as Dockerfile-dev from top to here)
# set working directory
WORKDIR /usr/src/app
# add and install requirements
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
# new
# add entrypoint.sh
COPY ./entrypoint.sh /usr/src/app/entrypoint-prod.sh
RUN chmod +x /usr/src/app/entrypoint-prod.sh
# add app
COPY . /usr/src/app
# new
# run server
CMD ["/usr/src/app/entrypoint-prod.sh"]
At production, the build hangs after numpy is installed, and it never resolves:
(...)
Collecting pip
Downloading https://files.pythonhosted.org/packages/d8/f3/413bab4ff08e1fc4828dfc59996d721917df8e8583ea85385d51125dceff/pip-19.0.3-py2.py3-none-any.whl (1.4MB)
Requirement already up-to-date: setuptools in /usr/local/lib/python3.6/site-packages (40.8.0)
Installing collected packages: pip
Found existing installation: pip 19.0.2
Uninstalling pip-19.0.2:
Successfully uninstalled pip-19.0.2
Successfully installed pip-19.0.3
Collecting numpy
Downloading https://files.pythonhosted.org/packages/2b/26/07472b0de91851b6656cbc86e2f0d5d3a3128e7580f23295ef58b6862d6c/numpy-1.16.1.zip (5.1MB)
Collecting matplotlib
Downloading https://files.pythonhosted.org/packages/89/0c/653aec68e9cfb775c4fbae8f71011206e5e7fe4d60fcf01ea1a9d3bc957f/matplotlib-3.0.2.tar.gz (36.5MB)
# HANGS HERE ˆˆˆˆˆ
The problem does not seem to be matplotlib, because if I remove that package it hangs at scipy, the next one after numpy, and so on...
NOTE: I am building production not at localhost but in a docker-machine.
$ docker-machine ls
NAME ACTIVE DRIVER STATE URL SWARM DOCKER ERRORS
testdriven-dev - virtualbox Running tcp://192.168.99.100:2376 v18.09.1
testdriven-prod * amazonec2 Running tcp://18.234.200.115:2376 v18.09.1 <------ THIS ONE
with:
$ docker-machine env testdriven-dev
$ eval $(docker-machine env testdriven-prod)
$ export REACT_APP_WEB_SERVICE_URL=http://18.234.200.115
$ docker-compose -f docker-compose-prod.yml up -d --build
The environment was pruned of any dangling images.
Why is this happening?
Edit
Following advice in the comments, I SSHed into the docker-machine to check resource use during the build; at the point of hanging, this is what I get:
$ docker-machine ssh testdriven-prod free
total used free shared buff/cache available
Mem: 1014540 136296 632136 10712 246108 692056
Swap: 0 0 0
and:
$ docker-machine ssh testdriven-prod df -h
Filesystem Size Used Avail Use% Mounted on
udev 488M 0 488M 0% /dev
tmpfs 100M 11M 89M 11% /run
/dev/xvda1 16G 2.2G 14G 14% /
tmpfs 496M 0 496M 0% /dev/shm
tmpfs 5.0M 0 5.0M 0% /run/lock
tmpfs 496M 0 496M 0% /sys/fs/cgroup
tmpfs 100M 0 100M 0% /run/user/1000
SSH:
top - 01:34:52 up 18 days, 23:47, 1 user, load average: 0.00, 0.00, 0.00
Tasks: 109 total, 1 running, 108 sleeping, 0 stopped, 0 zombie
%Cpu(s): 0.3 us, 0.0 sy, 0.0 ni, 99.7 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
KiB Mem : 1014540 total, 594752 free, 124092 used, 295696 buff/cache
KiB Swap: 0 total, 0 free, 0 used. 698272 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8363 root 20 0 435240 21768 3124 S 0.3 2.1 30:07.40 containerd
1 root 20 0 185312 4916 2996 S 0.0 0.5 0:12.98 systemd
more:
ubuntu@testdriven-prod:~$ ps aux -Hww
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 2 0.0 0.0 0 0 ? S Feb03 0:00 [kthreadd]
root 3 0.0 0.0 0 0 ? S Feb03 0:37 [ksoftirqd/0]
root 5 0.0 0.0 0 0 ? S< Feb03 0:00 [kworker/0:0H]
root 7 0.0 0.0 0 0 ? S Feb03 0:17 [rcu_sched]
root 8 0.0 0.0 0 0 ? S Feb03 0:00 [rcu_bh]
root 9 0.0 0.0 0 0 ? S Feb03 0:00 [migration/0]
root 10 0.0 0.0 0 0 ? S Feb03 0:07 [watchdog/0]
root 11 0.0 0.0 0 0 ? S Feb03 0:00 [kdevtmpfs]
root 12 0.0 0.0 0 0 ? S< Feb03 0:00 [netns]
root 13 0.0 0.0 0 0 ? S< Feb03 0:00 [perf]
root 14 0.0 0.0 0 0 ? S Feb03 0:00 [xenwatch]
root 15 0.0 0.0 0 0 ? S Feb03 0:00 [xenbus]
root 17 0.0 0.0 0 0 ? S Feb03 0:00 [khungtaskd]
root 18 0.0 0.0 0 0 ? S< Feb03 0:00 [writeback]
root 19 0.0 0.0 0 0 ? SN Feb03 0:00 [ksmd]
root 20 0.0 0.0 0 0 ? SN Feb03 0:03 [khugepaged]
root 21 0.0 0.0 0 0 ? S< Feb03 0:00 [crypto]
root 22 0.0 0.0 0 0 ? S< Feb03 0:00 [kintegrityd]
root 23 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 24 0.0 0.0 0 0 ? S< Feb03 0:00 [kblockd]
root 25 0.0 0.0 0 0 ? S< Feb03 0:00 [ata_sff]
root 26 0.0 0.0 0 0 ? S< Feb03 0:00 [md]
root 27 0.0 0.0 0 0 ? S< Feb03 0:00 [devfreq_wq]
root 30 0.0 0.0 0 0 ? S Feb03 0:07 [kswapd0]
root 31 0.0 0.0 0 0 ? S< Feb03 0:00 [vmstat]
root 32 0.0 0.0 0 0 ? S Feb03 0:00 [fsnotify_mark]
root 33 0.0 0.0 0 0 ? S Feb03 0:00 [ecryptfs-kthrea]
root 49 0.0 0.0 0 0 ? S< Feb03 0:00 [kthrotld]
root 50 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 51 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 52 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 53 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 54 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 55 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 56 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 57 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 58 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 59 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 60 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 61 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 62 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 63 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 64 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 65 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 66 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 67 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 68 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 69 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 70 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 71 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 72 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 73 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 74 0.0 0.0 0 0 ? S Feb03 0:00 [scsi_eh_0]
root 75 0.0 0.0 0 0 ? S< Feb03 0:00 [scsi_tmf_0]
root 76 0.0 0.0 0 0 ? S Feb03 0:00 [scsi_eh_1]
root 77 0.0 0.0 0 0 ? S< Feb03 0:00 [scsi_tmf_1]
root 79 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 83 0.0 0.0 0 0 ? S< Feb03 0:00 [ipv6_addrconf]
root 96 0.0 0.0 0 0 ? S< Feb03 0:00 [deferwq]
root 258 0.0 0.0 0 0 ? S< Feb03 0:00 [raid5wq]
root 288 0.0 0.0 0 0 ? S< Feb03 0:00 [bioset]
root 310 0.0 0.0 0 0 ? S Feb03 0:06 [jbd2/xvda1-8]
root 311 0.0 0.0 0 0 ? S< Feb03 0:00 [ext4-rsv-conver]
root 386 0.0 0.0 0 0 ? S< Feb03 0:00 [iscsi_eh]
root 389 0.0 0.0 0 0 ? S< Feb03 0:00 [ib_addr]
root 392 0.0 0.0 0 0 ? S< Feb03 0:00 [ib_mcast]
root 394 0.0 0.0 0 0 ? S< Feb03 0:00 [ib_nl_sa_wq]
root 397 0.0 0.0 0 0 ? S< Feb03 0:00 [ib_cm]
root 398 0.0 0.0 0 0 ? S< Feb03 0:00 [iw_cm_wq]
root 399 0.0 0.0 0 0 ? S< Feb03 0:00 [rdma_cm]
root 411 0.0 0.0 0 0 ? S Feb03 0:00 [kauditd]
root 541 0.0 0.0 0 0 ? S< Feb03 0:02 [kworker/0:1H]
root 23959 0.0 0.0 0 0 ? S< Feb03 0:00 [xfsalloc]
root 23960 0.0 0.0 0 0 ? S< Feb03 0:00 [xfs_mru_cache]
root 12186 0.0 0.0 0 0 ? S Feb21 0:00 [kworker/u30:2]
root 12198 0.0 0.0 0 0 ? S 00:21 0:00 [kworker/u30:1]
root 12219 0.0 0.0 0 0 ? S 00:21 0:00 [kworker/0:0]
root 13607 0.0 0.0 0 0 ? S 02:16 0:00 [kworker/0:2]
root 1 0.0 0.4 185312 5020 ? Ss Feb03 0:13 /lib/systemd/systemd --system --deserialize 27
root 366 0.0 0.2 28352 2212 ? Ss Feb03 0:04 /lib/systemd/systemd-journald
root 437 0.0 0.0 102968 372 ? Ss Feb03 0:00 /sbin/lvmetad -f
root 942 0.0 0.2 16116 2780 ? Ss Feb03 0:00 /sbin/dhclient -1 -v -pf /run/dhclient.eth0.pid -lf /var/lib/dhcp/dhclient.eth0.leases -I -df /var/lib/dhcp/dhclient6.eth0.leases eth0
root 1091 0.0 0.2 26068 2048 ? Ss Feb03 0:01 /usr/sbin/cron -f
daemon 1097 0.0 0.1 26044 1664 ? Ss Feb03 0:00 /usr/sbin/atd -f
message+ 1101 0.0 0.1 42992 1704 ? Ss Feb03 0:01 /usr/bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation
root 1110 0.0 0.3 272944 3360 ? Ssl Feb03 0:20 /usr/lib/accountsservice/accounts-daemon
root 1113 0.0 0.6 636464 7008 ? Ssl Feb03 0:07 /usr/bin/lxcfs /var/lib/lxcfs/
root 1139 0.0 0.2 28616 2452 ? Ss Feb03 0:01 /lib/systemd/systemd-logind
syslog 1140 0.0 0.2 260628 2228 ? Ssl Feb03 0:01 /usr/sbin/rsyslogd -n
root 1151 0.0 0.1 4396 1312 ? Ss Feb03 0:00 /usr/sbin/acpid
root 1157 0.0 0.0 5220 116 ? Ss Feb03 0:38 /sbin/iscsid
root 1158 0.0 0.3 5720 3508 ? S<Ls Feb03 3:04 /sbin/iscsid
root 1172 0.0 0.0 13372 144 ? Ss Feb03 0:00 /sbin/mdadm --monitor --pid-file /run/mdadm/monitor.pid --daemonise --scan --syslog
root 1263 0.0 0.1 12840 1588 ttyS0 Ss+ Feb03 0:00 /sbin/agetty --keep-baud 115200 38400 9600 ttyS0 vt220
root 1266 0.0 0.1 14656 1472 tty1 Ss+ Feb03 0:00 /sbin/agetty --noclear tty1 linux
root 8363 0.1 2.1 435240 21768 ? Ssl Feb03 30:09 /usr/bin/containerd
root 9248 0.0 5.6 583988 57744 ? Ssl Feb03 18:11 /usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --storage-driver overlay2 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=amazonec2
root 24091 0.0 0.1 277088 1652 ? Ssl Feb03 0:00 /usr/lib/policykit-1/polkitd --no-debug
root 1068 0.0 0.2 65512 2616 ? Ss Feb09 0:00 /usr/sbin/sshd -D
root 13498 0.0 0.6 92800 6580 ? Ss 01:32 0:00 sshd: ubuntu [priv]
ubuntu 13560 0.0 0.3 92800 3352 ? S 01:32 0:01 sshd: ubuntu#pts/0
ubuntu 13561 0.0 0.5 21388 5104 pts/0 Ss 01:32 0:00 -bash
ubuntu 13617 0.0 0.3 36228 3332 pts/0 R+ 02:23 0:00 ps aux -Hww
root 31239 0.0 1.5 292584 15836 ? Ssl Feb13 0:20 /usr/lib/snapd/snapd
systemd+ 22714 0.0 0.1 100324 1816 ? Ssl Feb20 0:00 /lib/systemd/systemd-timesyncd
root 23340 0.0 0.2 42124 2484 ? Ss Feb20 0:00 /lib/systemd/systemd-udevd
ubuntu 13500 0.0 0.4 45148 4608 ? Ss 01:32 0:00 /lib/systemd/systemd --user
ubuntu 13505 0.0 0.2 208764 2032 ? S 01:32 0:00 (sd-pam)
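One detail stands out in the Edit output: the amazonec2 machine reports roughly 1 GB of RAM (Mem: 1014540 KiB) with no swap, and on Alpine pip compiles numpy, scipy and matplotlib from source because no musl-compatible prebuilt wheels exist, which can look like an indefinite hang on a small host. A common workaround, sketched here with a placeholder registry name, is to build the image on a bigger machine and ship it through a registry instead of building on the target:

# build where there is enough RAM for the source compiles...
docker build -f services/web/Dockerfile-prod -t myregistry/web:prod services/web
docker push myregistry/web:prod
# ...then on the production docker-machine just pull the finished image
docker pull myregistry/web:prod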

Too many rake processes spawning on AWS Beanstalk Rails application

I'm deploying a Ruby on Rails application on AWS Elastic Beanstalk. The application also needs a Sidekiq process for background jobs, and there's a Sneakers process running to listen for messages from a RabbitMQ instance.
I created an upstart process for sidekiq using ebextensions from the process outlined here. Using the same outline, I created another upstart process for running the sneakers rake task. All the config files are here in this gist.
The deploy runs fine and I can see the sidekiq and sneakers processes running, but after a few deploys I began seeing a number of rake processes spawned that take up database connections:
[root@ip-XXX ec2-user]# ps aux | grep '[/]opt/rubies/ruby-2.0.0-p648/bin/rake'
webapp 13563 0.0 2.2 1400644 184988 ? Sl 01:41 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 13866 0.7 2.3 694804 193620 ? Sl 01:42 0:10 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14029 0.0 2.2 1400912 183700 ? Sl 01:42 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14046 0.0 2.2 1400912 183812 ? Sl 01:42 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14048 0.0 2.2 1400912 183804 ? Sl 01:42 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14073 0.0 2.2 1400912 183712 ? Sl 01:42 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14158 0.0 2.2 827056 187972 ? Sl Nov23 4:23 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19139 0.9 2.3 694744 193388 ? Sl 01:47 0:10 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19273 0.0 2.2 1400852 183680 ? Sl 01:47 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19290 0.0 2.2 1400852 183732 ? Sl 01:47 0:00 /opt/rubies/ruby-2.0.0-p648/bin/rake
[root@ip-XXX ec2-user]# ps auxf
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
webapp 14158 0.0 2.2 827056 187972 ? Sl Nov23 4:24 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 13563 0.0 2.2 1400644 185700 ? Sl 01:41 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 13866 0.4 2.3 694804 193620 ? Sl 01:42 0:11 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14029 0.0 2.2 1400912 184412 ? Sl 01:42 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14046 0.0 2.2 1400912 184372 ? Sl 01:42 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14048 0.0 2.2 1400912 184516 ? Sl 01:42 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 14073 0.0 2.2 1400912 184540 ? Sl 01:42 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19139 0.4 2.3 694876 193428 ? Sl 01:47 0:11 /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19273 0.0 2.2 1400852 184288 ? Sl 01:47 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19290 0.0 2.2 1400852 184472 ? Sl 01:47 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19293 0.0 2.2 1400852 184488 ? Sl 01:47 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 19333 0.0 2.2 1400852 184420 ? Sl 01:47 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
root 21038 0.0 0.0 217276 3460 ? Ssl 01:55 0:00 PassengerWatchdog
webapp 21041 0.1 0.0 704036 5652 ? Sl 01:55 0:02 \_ PassengerHelperAgent
webapp 21047 0.0 0.0 243944 7840 ? Sl 01:55 0:00 \_ PassengerLoggingAgent
root 21056 0.0 0.0 56404 1016 ? Ss 01:55 0:00 PassengerWebHelper: master process /var/lib/passenger/standalone/4.0.60/webhelper-1.8.1-x86_64-linux/PassengerWebHelper -c /tmp/passenger-standalone.e022jt/config -p /tmp/passenger-standalone.e022jt/
webapp 21057 0.0 0.0 56812 4436 ? S 01:55 0:00 \_ PassengerWebHelper: worker process
webapp 21058 0.0 0.0 56812 4436 ? S 01:55 0:00 \_ PassengerWebHelper: worker process
root 21063 0.0 0.0 8552 1104 ? Ss 01:55 0:00 /var/lib/passenger/standalone/4.0.60/support-x86_64-linux/agents/TempDirToucher /tmp/passenger-standalone.e022jt --cleanup --daemonize --pid-file /tmp/passenger-standalone.e022jt/temp_dir_toucher.pid --log-f
root 21078 0.0 0.0 11600 2748 ? Ss 01:55 0:00 /bin/bash
root 21102 0.0 0.0 54764 2556 ? S 01:55 0:00 \_ su -s /bin/bash -c bundle exec sidekiq -L /var/app/current/log/sidekiq.log -P /var/app/support/pids/sidekiq.pid
root 21103 8.1 2.6 1452872 212932 ? Sl 01:55 2:27 \_ sidekiq 4.1.2 current [0 of 25 busy]
root 21118 0.0 0.0 54768 2644 ? Ss 01:55 0:00 su -s /bin/bash -c bundle exec rake sneakers:run >> /var/app/current/log/sneakers.log 2>&1 webapp
webapp 21146 0.0 0.0 9476 2336 ? Ss 01:55 0:00 \_ bash -c bundle exec rake sneakers:run >> /var/app/current/log/sneakers.log 2>&1
webapp 21147 0.6 2.3 693604 193232 ? Sl 01:55 0:11 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 21349 0.0 2.2 1400608 184160 ? Sl 01:55 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 21411 0.0 2.2 1400608 183812 ? Sl 01:55 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 21414 0.0 2.2 1400608 183988 ? Sl 01:55 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 21475 0.0 2.2 1400608 183976 ? Sl 01:55 0:00 \_ /opt/rubies/ruby-2.0.0-p648/bin/rake
webapp 21720 0.3 3.5 1311928 293968 ? Sl 01:55 0:07 Passenger RackApp: /var/app/current
I'm not sure what spawned these processes (whether it was sidekiq, sneakers, or passenger). With each deploy the number seems to grow until the Postgres connections are maxed out.
Is my beanstalk configuration incorrect? Can anybody help me debug this so I can figure out what's creating these processes?
It looks like every time the sneakers rake task was killed, it left behind orphan processes. To rectify this, I added the following as a pre-deploy hook:
files:
  "/opt/elasticbeanstalk/hooks/appdeploy/pre/04_mute_sneakers.sh":
    mode: "000755"
    content: |
      #!/bin/bash
      initctl stop sneakers 2>/dev/null
      kill $(ps aux | grep '[/]opt/rubies/ruby-2.0.0-p648/bin/rake' | awk '{print $2}') 2>/dev/null
      echo "Killed Sneakers Process"
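For reference, the same cleanup can be written with pkill -f, which matches against the full command line directly and avoids the ps-grep-awk pipeline (an equivalent sketch, not a change in behavior):

#!/bin/bash
# stop the upstart job, then kill any rake processes it left behind
initctl stop sneakers 2>/dev/null
pkill -f '/opt/rubies/ruby-2.0.0-p648/bin/rake' 2>/dev/null
echo "Killed Sneakers Process"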
