I have inherited a Swift iOS project that includes Firebase, and I need to troubleshoot an issue with first login and account creation, so I expect the work will involve repeatedly deleting a single test account.
Is it possible to reach the database's web frontend using only the information included in the GoogleService-Info.plist file that came with the project? Or might I need some information from the previous developer?
The DATABASE_URL ends in firebaseio.com.
Putting the URL from DATABASE_URL into a web browser will expand it to something like console.firebase.google.com/u/<number>/project/<root> where:
<number> will be 0, 1, 2, etc., but probably 0.
<root> will be the project name, which is also the first part of the DATABASE_URL string in GoogleService-Info.plist.
You might find that the URL returns a page that states "There was an error while processing the request. Try Again". This is because you're logged in on a different Google account that cannot access that DB. Logging in with the right account should get you the page you want. (This will also change <number> to a new value which is the one you'll always want to use when accessing this DB)
After that you can navigate to authentication/users to see your created accounts, and you can sort by creation date to easily access the accounts you want to delete.
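As an aside, if deleting the same test account over and over in the console gets tedious, you can also script it with the Firebase Admin SDK. Here is a minimal Python sketch, assuming you have exported a service-account key for the project (the key path and email are placeholders):

# pip install firebase-admin
import firebase_admin
from firebase_admin import auth, credentials

# Placeholder path: a service-account key exported from
# Project settings -> Service accounts in the Firebase console.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred)

def delete_test_account(email):
    # Look up the user by email and delete it; ignore "not found".
    try:
        user = auth.get_user_by_email(email)
        auth.delete_user(user.uid)
        print("Deleted", email, "uid:", user.uid)
    except auth.UserNotFoundError:
        print("No account found for", email)

delete_test_account("test-user@example.com")  # placeholder email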
I would like to programmatically retrieve and process all logs available from the Office 365 Unified Audit Logs for the purpose of forensic investigation. From the front end, these logs are available through the Office 365 Compliance Admin Center.
I have tried the following options to access these logs from a script, with no success:
Microsoft 365 Management API - This contains the correct data, but is of limited usefulness for forensic investigations due to the short 7-day retention period.
Microsoft Graph - This does not contain all the relevant data - you cannot access the Unified Audit Logs directly through Graph, and the usage reports do not cover all items contained in the Audit Logs (e.g. Exchange actions).
Search-UnifiedAuditLog on Exchange Online PowerShell - Microsoft themselves recommend not to use this programmatically, and I've experienced extremely unreliable results and unmanageable rate-limiting when trying to do so.
So is there something I'm missing here, or is there no way to programmatically retrieve all items from the Unified Audit Logs for the entire retention period (generally 90 days)?
As far as I know, the only way to do this is to call the Management API on a regular basis and output the results to some solution for long-term storage (an Azure Log Analytics workspace comes to mind, or a SIEM like Splunk or Graylog), i.e. write a script that retrieves the logs for the last week and run it at least weekly.
I'll explain how to retrieve logs manually and also show a tool which already exists for this at the bottom of the post.
Manually:
1: Enable Audit logging on the tenant if not already enabled
2: Create an app registration in Azure AD. For retrieving audit logs from a single tenant, choose "Accounts in this organizational directory only (xyz only - Single tenant)".
3: Create a client secret from within the newly created app registration. Store it somewhere safe, as it is only shown once. From the overview page of the app registration, also note the "Tenant ID" and the "Application (Client) ID". You will need all three.
4: From within the new app registration, go to "API permissions" and add permissions of the 'Application' type for 'ActivityFeed.Read' and 'ActivityFeed.ReadDlp'.
5: For the following steps you will need to start calling the Office 365 Management APIs, which require a bearer token in the Authorization header. To obtain one, send the following POST request:
URL: https://login.microsoftonline.com/***tenant_ID***/oauth2/token
Headers: "{'Content-Type': 'application/x-www-form-urlencoded'}"
Data: "grant_type=client_credentials&client_id=Application_ID&client_secret=Secret_Key&resource=https://manage.office.com"
You will receive a JSON response which contains 'access_token'. For all the upcoming API calls, use the following header:
"{'Content-Type': 'application/x-www-form-urlencoded', 'Authorization': 'bearer access_token'}"
6: Subscribe to the audit log feeds you would like to retrieve. The following exist: 'Audit.General', 'Audit.AzureActiveDirectory', 'Audit.Exchange', 'Audit.SharePoint', 'DLP.All'. The POST for Exchange for example would look like: "https://manage.office.com/api/v1.0/tenant_ID/activity/feed/subscriptions/start?contentType=Audit.Exchange"
7: You are now ready to start retrieving actual logs. Individual logs live inside content blobs, which live inside pages, which live inside feeds (e.g. the Audit.Exchange feed). Therefore, for each feed you would like to retrieve logs from, you must collect all the content blobs (iterating through the pages of them) and then retrieve the actual content from that blob.
To retrieve a page of content blobs, use the following URL (substitute your own tenant ID, content type, and time window): "https://manage.office.com/api/v1.0/tenant_ID/activity/feed/subscriptions/content?contentType=Audit.Exchange&startTime=2022-04-13T09:42:52&endTime=2022-04-14T08:42:52"
This will give you a JSON response with content blobs inside. In the response header check "NextPageUri"; if it contains a URL, call that URL for the next page of content.
Now that you have content blobs, use them to retrieve the actual logs. Each content blob is a JSON dict, which contains a "contentUri" field. Call that URL to retrieve a JSON response with the actual logs inside.
You can do this in most programming/scripting languages, but for larger amounts of logs you will want to retrieve logs in parallel, or it will take a long time.
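Putting steps 6 and 7 together, a sequential Python sketch of the retrieval loop might look like this (auth_headers is the header dict built in step 5; the tenant ID and time window are placeholders):

import requests

TENANT_ID = "your-tenant-id"  # placeholder
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

def fetch_logs(content_type, start, end, auth_headers):
    # Collect all content blobs page by page, then fetch each blob's logs.
    logs = []
    url = (f"{BASE}/subscriptions/content?contentType={content_type}"
           f"&startTime={start}&endTime={end}")
    while url:
        resp = requests.get(url, headers=auth_headers)
        resp.raise_for_status()
        for blob in resp.json():  # one page of content blob descriptors
            content = requests.get(blob["contentUri"], headers=auth_headers)
            content.raise_for_status()
            logs.extend(content.json())  # the actual log records
        url = resp.headers.get("NextPageUri")  # next page, if any
    return logs

logs = fetch_logs("Audit.Exchange", "2022-04-13T09:42:52",
                  "2022-04-14T08:42:52", auth_headers)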
With a tool:
In case you want to use an existing tool, this one is free, works on Linux and Windows, and supports multiple outputs.
I am in the process of switching the LDAP backend that we use to authenticate access to Gerrit.
When a user logs in via LDAP, a local account is created within Gerrit. We are running version 2.15 of Gerrit, and therefore our local user accounts have migrated from the SQL DB into NoteDB.
The changes in our infrastructure mean that once the LDAP backend has been switched, user logins will appear to Gerrit as new users, and therefore new local accounts will be generated. As a result we will need to perform a number of administrative tasks on the existing local accounts before and after migration.
The REST API exposes some of the functionality that we need, however two key elements appear to be missing:
There appears to be no way to retrieve a list of all local accounts through the API (such that I could then iterate through to perform the administrative tasks I need to complete). The /accounts/ endpoint insists on a query filter being specified, which does not appear to include a way to simply specify 'all' or '*'. Instead I am having to try and think of a search filter that will reliably return all accounts - I haven't succeeded yet.
There appears to be no way to delete an account. Once the migration is complete, I need to remove the old accounts, but nothing is documented for the API or any other method to remove old accounts.
Has anybody found a solution to either of these tasks that they could share?
I came to the conclusion that the answers to my questions were:
('/a/' in the examples below accesses the authenticated administrative endpoint, so basic auth is required and the user must have appropriate permissions)
Retrieving all accounts
There is no way to do this in a single query; however, combining the results of:
GET /a/accounts?q=is:active&n=<number larger than the number of users>
GET /a/accounts?q=is:inactive&n=<number larger than the number of users>
will give effectively the same thing.
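A rough Python sketch of combining the two queries (the host, credentials, and page size are placeholders; note that Gerrit prefixes its JSON responses with a )]}' line that must be stripped before parsing):

import json
import requests

GERRIT = "https://gerrit.example.com"  # placeholder host
AUTH = ("admin", "http-password")      # placeholder credentials
N = 10000  # larger than the number of users

def query_accounts(query):
    resp = requests.get(f"{GERRIT}/a/accounts/",
                        params={"q": query, "n": N}, auth=AUTH)
    resp.raise_for_status()
    # Strip Gerrit's )]}' anti-XSSI prefix before parsing the JSON.
    return json.loads(resp.text.split("\n", 1)[1])

all_accounts = query_accounts("is:active") + query_accounts("is:inactive")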
Deleting an account
It seems that this simply is not supported. The only option appears to be to set an account inactive:
DELETE /a/accounts/<account_id>/active
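And a matching sketch for deactivating the accounts you have identified as old (again with placeholder values; old_accounts is whatever subset of the account list you decide to retire):

def set_inactive(account_id):
    # Gerrit returns 204 No Content on success.
    resp = requests.delete(f"{GERRIT}/a/accounts/{account_id}/active",
                           auth=AUTH)
    resp.raise_for_status()

for account in old_accounts:  # hypothetical: the accounts to retire
    set_inactive(account["_account_id"])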
Context of what I'm trying to accomplish:
User shares a file with the bot
Other users interact with the bot via a dialog
The bot shares the original file to the other users
For example, we want to share a file to the bot that contains this week's cafeteria menu. Each time users would interact with the bot in a certain way, it would share the cafeteria menu with them so that they can consult it.
I've tried calling the files.share method, but bots can't perform this action (I get an invalid token type error).
As far as I can tell, there is no way to do this currently. I've tried link unfurling in the message body but that only works if the file itself was already shared to the user. If not, the link simply won't unfurl and clicking it will fail.
The bot can perform a files.upload call and re-upload the contents of the file to each user individually. This seems incredibly wasteful, but it appears to be the only way that currently works.
Is there something I'm missing?
The reason your bot cannot use files.share is that this is an undocumented API method and you need a legacy token to use it. No other token (user token, bot token) will work, because it requires the post scope, which only exists for legacy tokens.
Approach A: Legacy Token
So one approach would be to use a legacy token with your bot, which you can create for your current workspace. That should work nicely if your Slack app is only used on your "own" Slack workspace, where you can create and use a legacy token.
Approach B: File Mention
Another approach is to use the mention feature in messages to share a file. This works by sending the private link (url_private property) of an already shared file in a message to a new channel. This will automatically re-share the file in that channel. I believe this only works with files that have previously been shared in a public channel and can therefore be re-shared. Be aware though that the file mention feature is currently being reworked, so this behavior might change.
Example:
https://slack.com/api/chat.postMessage?token=TOKEN&channel=CHANNEL&as_user=true&text=URL_PRIVATE
For more details see the Slack tutorial Storing, retrieving, and modifying file uploads.
Approach C: External File / image file
If you host your file externally or create a public URL for a file uploaded to Slack, you can share it in every channel by just adding the URL to a message. Slack will automatically unfurl it and therefore share it with the user in any channel. This is different from Approach B, because it is not a file mention and requires a public URL. You get the public URL of an uploaded file by calling files.sharedPublicURL.
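A rough Python sketch of Approach C using the slack_sdk package (the token, file ID, and channel ID are placeholders; as far as I know files.sharedPublicURL needs a user token with the appropriate files scope, not a bot token):

# pip install slack_sdk
from slack_sdk import WebClient

client = WebClient(token="xoxp-your-user-token")  # placeholder token

# Make the uploaded file publicly accessible, then post its public URL.
shared = client.files_sharedPublicURL(file="F12345678")  # placeholder file ID
public_url = shared["file"]["permalink_public"]
client.chat_postMessage(channel="C12345678", text=public_url)  # placeholder channel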
If I'm not wrong, you can do it like this:
you share a file with your bot
you retrieve the shared file's ID, and from that its url_private property (cf. https://api.slack.com/types/file#authentication)
you then download the file
you can then re-share it several times later (without re-uploading it to each user)...
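A sketch of that flow in Python with slack_sdk (the token, file ID, and channel IDs are placeholders): fetch the file's url_private once, then post it as a message wherever the file should appear, which re-shares it without re-uploading the bytes:

# pip install slack_sdk
from slack_sdk import WebClient

client = WebClient(token="xoxp-your-token")  # placeholder token

# Retrieve the shared file's metadata to get its private URL.
info = client.files_info(file="F12345678")   # placeholder file ID
url_private = info["file"]["url_private"]

# Posting the private URL in a message re-shares the file in that channel
# (the file-mention behavior described in Approach B above).
for channel in ["C11111111", "C22222222"]:   # placeholder channel IDs
    client.chat_postMessage(channel=channel, text=url_private, as_user=True)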
I am currently working on a solution that is accessing OneDrive in Office 365 using Microsoft Graph. I am using the adal4j library to handle authentication and have configured the app in portal.azure.com.
My question relates to the call to get the children for a specified drive. I am using a query similar to the one shown below, as I want to get the folders and files at the root level of a specified user's drive:
https://graph.microsoft.com/v1.0/users/*user id*/drives/*drive id*/root/children
When I login to the Graph Explorer and execute the query, I get a json result showing the root folder contents for the drive and user specified. All works as expected.
When I call it from my java application, the JSON node value is empty ([]).
Initially my thought was that, because the Graph Explorer uses a different app ID in the portal, it was possibly something to do with access rights. However, I can successfully read user profiles in our O365 tenant and the drive IDs for each user, and if I execute the following:
https://graph.microsoft.com/v1.0/users/*user id*/drives/*drive id*/root/search(q='')
it provides me with a complete list of all of the folders, subfolders, etc. within the appropriate user's drive.
This makes me think it is a bug in the Graph query I am attempting to use rather than an authorization issue, but that wouldn't explain why it works in the Graph Explorer.
The same java method is used for all calls, and the url is passed in as a parameter.
Just to follow up: the Azure portal app registration has the capability of adding permissions for the Graph API, and this was indeed the problem. It would appear that the search call was ignoring the missing permission and successfully reading the data, whereas the /children call was honouring the security model. This caused a lot of confusion, but it is now resolved.
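For reference, a minimal Python sketch of the /children call once the appropriate application permission (e.g. Files.Read.All, with admin consent) is in place; the user ID, drive ID, and token are placeholders:

import requests

ACCESS_TOKEN = "token-from-your-aad-app"  # placeholder
USER_ID = "user-id"                       # placeholder
DRIVE_ID = "drive-id"                     # placeholder

url = (f"https://graph.microsoft.com/v1.0/users/{USER_ID}"
       f"/drives/{DRIVE_ID}/root/children")
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
for item in resp.json()["value"]:  # folders and files at the drive root
    print(item["name"])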
Thanks Marc for your help.
Last year I implemented Oauth2 for users of our app to sign in via Google, obtaining the client ID and configuring multiple permitted URLs via the Google Dev Console at https://console.developers.google.com/project/<our-project-id>/apiui/credential?authuser=0.
I now need to add another URL to the list, but the UI for the Google Dev Console has changed. Using the new UI, I don't understand how to view the URLs I already configured, and I don't understand how to add another. The documentation I've found describes adding stuff such as credentials and domains, but I don't want to click anything until I understand it better because I can't risk breaking the production app, which relies on the existing setup I established with the old Google Dev Console. In particular, I don't think I should add or change anything until I can at least see the configuration data I previously created.
I'd appreciate info on how to view my existing configuration data, either by somehow accessing the old UI or by clicking the appropriate controls in the new one. From there I'll hopefully be able to figure out how to add another URL.
By the way, the URL I want to add is a proxy server for using "ionic serve" (http://10.0.0.15:8100/app/oauth_redirect) if that's relevant to your answer.
You can click on the Credentials link in the left-side menu and then click on your app, shown as a hyperlink. You will then get options to add more URLs to the redirect URI list or the authorized URI list, and you will also be able to see your previously configured URLs.