I'm trying out a few services that provide GPUs for model training, and I'm currently using datacrunch.io. I'm trying to figure out how to upload files from my local machine to the DataCrunch instance. Their website says: "You can use any program that supports SFTP; WinSCP or Filezilla are popular options." So I provided an SSH key while creating the instance, and I'm trying to transfer files using WinSCP, giving the IP of the instance as the host. It seems to be asking me for a username and password, which I don't know. If anyone has used datacrunch.io or a similar service, can they please help me out?
In WinSCP you also need to point to your SSH private key file, since authentication is only possible via SSH key. The password field must be left blank; the username is whatever account your provider created on the instance (check their documentation).
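If you'd rather script the transfer, here is a minimal sketch using Python's paramiko library - the host, username, key path, and file names are placeholders for whatever your instance actually uses:

```python
import os
import paramiko  # pip install paramiko

HOST = "INSTANCE_IP"                       # the instance's public IP
USER = "ubuntu"                            # placeholder: the account your provider created
KEY = os.path.expanduser("~/.ssh/id_rsa")  # the private key you registered with the instance

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER, key_filename=KEY)

# Upload over SFTP, the same protocol WinSCP uses
sftp = client.open_sftp()
sftp.put("training_data.zip", f"/home/{USER}/training_data.zip")
sftp.close()
client.close()
```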
I am using Apache Airflow 2.2.3 with Python 3.9 and run everything in Docker containers.
When I add connections to Airflow I do it via the GUI, because this way the passwords are supposed to be encrypted. For the encryption to work, I installed the Python package "apache-airflow[crypto]" on my local machine and generated a Fernet key, which I then put in my docker-compose.yaml as the variable "AIRFLOW__CORE__FERNET_KEY: 'MY_KEY'".
I also added the package "apache-airflow[crypto]" to my Airflow repository's requirements.txt so that Airflow can handle Fernet keys.
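For context, the key itself is generated with the cryptography package that "apache-airflow[crypto]" pulls in - a minimal sketch of what I did:

```python
from cryptography.fernet import Fernet

fernet_key = Fernet.generate_key()  # 32 url-safe base64-encoded bytes
print(fernet_key.decode())          # this value goes into AIRFLOW__CORE__FERNET_KEY
```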
My questions are the following:
1. When I add the Fernet key as an environment variable as described, I can see the key in docker-compose.yaml, and inside the container os.environ["AIRFLOW__CORE__FERNET_KEY"] shows it too - isn't that unsafe? As far as I understand it, credentials can be decrypted using this Fernet key.
2. When I add connections to Airflow, I can get their properties via the container CLI using "airflow connections get CONNECTION_NAME". Although I added the Fernet key, I see the password in plain text here - isn't that supposed to be hidden?
3. Unlike passwords, the values (connection strings) in the GUI's "Extra" field do not disappear and are readable right there in the GUI. How can I hide those credentials from the GUI and from the CLI?
4. The Airflow GUI tells me that my connections are encrypted, so the encryption seems to have worked somehow. But what does that statement mean when I can clearly see the passwords?
I think you are making wrong assumptions about "encryption" and "security". The assumption that you can hide secrets from a user who has access to the running software (which the Airflow CLI gives you) is unrealistic - it is not really "physically achievable".
The Fernet key is used to encrypt data "at rest" in the database. If your database content is stolen (but not your Airflow program/configuration), your data is protected. This is the ONLY purpose of the Fernet key: it protects your data stored in the database "at rest". But once you have the key (from the Airflow runtime), you can decrypt it. Usually the database lives on some remote server and has backups. As long as the backups are not kept together with the key, then even if your database or a backup gets stolen while your Airflow instance stays safe, no one will be able to use that data.
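To make that concrete, here is a minimal sketch of what Fernet does with a stored value, using the same cryptography package Airflow uses:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # what AIRFLOW__CORE__FERNET_KEY holds
f = Fernet(key)

token = f.encrypt(b"db-password")  # this ciphertext is what lands in the metadata DB
print(token)                       # useless to someone who steals only the database
print(f.decrypt(token))            # b'db-password' - trivial for anyone holding the key
```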
Yes. If you have access to a running Airflow instance, you are supposed to be able to read passwords in clear text. How else would Airflow work? It needs to read the passwords to authenticate, so if you can run the Airflow program, the data must be accessible to you. There is no way around this; it is impossible by design. Airflow is written in Python, and anyone who can reach its runtime (for example via the CLI) can write code against it, so there is no way to physically protect secrets that "Airflow core" has to know at runtime in order to connect and communicate with external systems. That is a basic property of any system that authenticates against external systems and is accessible at runtime.
What you CAN do to protect your data better is use a secrets manager: https://airflow.apache.org/docs/apache-airflow/stable/security/secrets/secrets-backend/index.html - but at most this gives you the possibility of frequent secret rotation. Frequent rotation and temporary credentials are the only real mitigation, so that a potentially leaked credential cannot be used for long.
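As an illustration, switching to (say) the HashiCorp Vault secrets backend is just a configuration change; a hypothetical setup, normally placed in the environment or docker-compose.yaml but shown as Python for readability:

```python
import os

# Hypothetical values - adjust the backend class and kwargs to your setup
os.environ["AIRFLOW__SECRETS__BACKEND"] = (
    "airflow.providers.hashicorp.secrets.vault.VaultBackend"
)
os.environ["AIRFLOW__SECRETS__BACKEND_KWARGS"] = (
    '{"connections_path": "connections", "url": "http://vault:8200"}'
)
```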
Modern Airflow (2.1+, I believe) has a secrets masker that also masks sensitive data from extras when you mark it as such: https://airflow.apache.org/docs/apache-airflow/stable/security/secrets/mask-sensitive-values.html. The masker also hides sensitive data in logs, because logs can be archived and backed up, so - similarly to the database - it makes sense to protect them. The UI - unlike the CLI, which gives you access to the runtime of the Airflow "core" - is just a front-end and does not give you access to the running core, so masking sensitive data there also makes sense.
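For values Airflow does not pick up as sensitive by itself, you can register them with the masker; a minimal sketch (the token value is a placeholder):

```python
from airflow.utils.log.secrets_masker import mask_secret

# After this call, any occurrence of the value in task logs or
# rendered fields is replaced with ***
mask_secret("my-extra-api-token")
```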
I have a Jenkins server named "jenkins" on a remote machine, and I currently use its actual IP address to access it. I also have a domain name for my web server on another machine: www.mysite.com.
Is it possible to configure DNS names to use "jenkins.mysite.com" to access my Jenkins server machine without registering another independent domain name?
Further, I might have another machine to host my wiki, so I would like to access it as "wiki.mysite.com".
Thanks.
Yes, it is not only possible, but extremely common. It is a perfectly ordinary use of DNS. The entity controlling mysite.com can add whatever names they want under it (barring some technical limitations).
The details of what you personally need to do to add those other names will, of course, depend entirely on your environment. It can be anything from editing a zone file or using a web administration interface to talking to a sysadmin.
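For example, if mysite.com's zone is managed as a BIND-style zone file, the additions could look like this (the IPs below are placeholders from the documentation range):

```
; hypothetical records added to the mysite.com zone
jenkins  IN  A  203.0.113.10   ; machine currently running Jenkins
wiki     IN  A  203.0.113.11   ; future wiki host
```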
I currently have a txt file on a Linux install that I need to access and write back to from my app. The app and the Linux server are on the same subnet and I have full control of both machines (permissions). I've thought about SSHing into the machine, but that has its security drawbacks with sending raw credentials around. Does anyone have suggestions for a framework to create a secure tunnel, or perhaps an alternative solution?
Write a simple web service in your language of choice (PHP, Python, Ruby, etc.) and make a simple, secure call to your service with the changes. See the sketch below.
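For instance, a minimal sketch in Python with Flask - the file path, token, and port are placeholders, and it should be served over HTTPS so the token is not sent in the clear:

```python
from flask import Flask, request, abort

app = Flask(__name__)
DATA_FILE = "/srv/data/notes.txt"  # hypothetical path to the shared text file
API_TOKEN = "change-me"            # shared secret checked on every request

def check_auth():
    # Reject callers that don't present the shared token
    if request.headers.get("X-Api-Token") != API_TOKEN:
        abort(401)

@app.route("/file", methods=["GET"])
def read_file():
    check_auth()
    with open(DATA_FILE) as f:
        return f.read()

@app.route("/file", methods=["PUT"])
def write_file():
    check_auth()
    with open(DATA_FILE, "w") as f:
        f.write(request.get_data(as_text=True))
    return "", 204

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The app then reads the file with a GET and writes it back with a PUT, sending the token in the X-Api-Token header.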
I'm struggling to find an answer to this. I have a website that is deployed in a shared hosting environment. I want to allow people to upload files to my Azure Blob Storage account.
I have this working locally, using the storage emulator, however when I publish the site I get a Security Exception.
Is this actually possible under a shared hosting environment?
Cheers
A bit more detail would help in understanding how these uploads are taking place. That said, I'll make the assumption that people are uploading directly to Blob Storage, and not through your website (or web service).
To allow direct uploads, you either need to make a blob or container public (which everyone in the world can see), or create a temporary Shared Access Signature (SAS) on a specific blob or container that grants access for a short time window; there's a sketch below.
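As an illustration, here is how a short-lived SAS could be generated with the current azure-storage-blob Python SDK (v12); the account, key, and blob names are placeholders, and the SDKs available when this was asked differed, so treat it as a sketch of the idea rather than drop-in code:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas = generate_blob_sas(
    account_name="myaccount",
    container_name="uploads",
    blob_name="incoming/file.bin",
    account_key="ACCOUNT_KEY",  # keep this server-side only
    permission=BlobSasPermissions(create=True, write=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
)
upload_url = f"https://myaccount.blob.core.windows.net/uploads/incoming/file.bin?{sas}"
# Hand upload_url to the client; it PUTs the file straight to Blob Storage,
# and the signature expires after 15 minutes.
```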
If your app is Silverlight, then you are probably running into a cross-domain issue (and you'll need to correct that with an access policy).
If you provide more details around the way uploads are being sent, as well as the client and server technology, I can edit my answer to be more specific.
I am in the process of creating a Ruby on Rails portal.
This portal requires a lot of data feeding by the site owner's back-office personnel.
My client has this problem:
The office staff should not be able to access the back-office interface from any computers other than the ones in his office.
I have no idea how to achieve this. Is there a method for this?
Thanks in advance.
Edit:
Is tracking the MAC address a good solution? Is it even possible?
I don't think you should do anything in Rails - this should be configured elsewhere. If Rails is running on Apache, see mod_access (called mod_authz_host in Apache 2.2 and later).
The best way to set this up is to host the app INSIDE the organisation's firewall: the server lives inside the company, on a subnet isolated entirely from direct internet access.
If you currently host outside the company, you can set up a firewall that prevents access from unknown IP addresses. You would only accept requests that come from the company's IP ranges. Ideally, you do this at the host/operating system level.
If that can't work, you can add a check to your Rails authentication: detect the IP address of the request, and if it is not in the company's range, deny access. The logic is sketched below.
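The check itself is simple. Here it is sketched in Python with the standard-library ipaddress module (the office range is a placeholder); the same few lines translate directly into a Rails before_action:

```python
from ipaddress import ip_address, ip_network

OFFICE_NET = ip_network("198.51.100.0/24")  # hypothetical office IP range

def allowed(remote_addr: str) -> bool:
    # True only for requests originating from the office range
    return ip_address(remote_addr) in OFFICE_NET

assert allowed("198.51.100.42")
assert not allowed("203.0.113.7")
```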
Found a solution: using a Java applet, one can access the machine's hardware details, including the MAC address. I'm using this idea.