MS Graph API: Note Page Results Not Up-To-Date

When I load pages via
https://graph.microsoft.com/v1.0/me/onenote/sections/{ID}/pages?$top=100&$orderby=createdDateTime%20desc
the results are not up-to-date, even though changes have been synced across other devices.
There should be 4 pages in the section. This is what is in the response:
two of the existing pages (the other two are missing)
several other pages that were previously deleted; when I try to fetch their HTML contents, I get a 404. The pages were deleted via DELETE https://graph.microsoft.com/v1.0/me/onenote/pages/{ID} per the docs.
Does it work via the OneNote API?
Per @codeye's suggestion (see comments), I'm trying to access the pages via the OneNote API. It seems unclear whether this is still possible:
The API is deprecated and scheduled to sunset November 2022.
The docs refer to registering the application in the Microsoft account Developer Center, which appears to be obsolete and now points back to the Azure portal.
Later in the same doc, they refer to permissions like office.onenote_update that no longer seem to exist, i.e. I can't find them in the Azure portal. I do see permissions with the same names in two places, Microsoft Graph and OneNote; perhaps the permissions have been renamed?
However, after adding OneNote -> Notes.ReadWrite, logging out, and reauthenticating, I'm still getting HTTP errors on onenote.com URLs like:
401 from https://www.onenote.com/api/v1.0/me/notes/sections/{ID}/pages?$top=100&$orderby=createdDateTime%20desc (as @codeye suggested below; tried with and without query parameters)
401 from https://www.onenote.com/api/v1.0/notebooks (URL from the docs)
404 from https://www.onenote.com/api/v1.0/me/notebooks (same URL as the previous, but with the me/ segment added)
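For anyone trying to reproduce this, here is a minimal Python sketch (using the requests library) of the listing call plus a probe that flags the stale entries; the token and section ID are placeholders:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "eyJ..."  # placeholder: a delegated token with OneNote read permission
SECTION_ID = "{ID}"      # placeholder: the section ID from the question
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List the pages exactly as in the question.
resp = requests.get(
    f"{GRAPH}/me/onenote/sections/{SECTION_ID}/pages",
    headers=HEADERS,
    params={"$top": 100, "$orderby": "createdDateTime desc"},
)
resp.raise_for_status()

# Probe each page's content endpoint; the stale, already-deleted
# entries described above come back as 404.
for page in resp.json().get("value", []):
    content = requests.get(
        f"{GRAPH}/me/onenote/pages/{page['id']}/content", headers=HEADERS
    )
    status = "stale (404)" if content.status_code == 404 else "ok"
    print(f"{page.get('title', '<untitled>')}: {status}")
```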

Related

Programmatically Retrieve all Office 365 Unified Audit Logs

I would like to programmatically retrieve and process all logs available from the Office 365 Unified Audit Logs for the purpose of forensic investigation. From the front end, these logs are available through the Office 365 Compliance Admin Center.
I have tried the following options to access these logs from a script, with no success:
Microsoft 365 Management API - This contains the correct data, but is of limited usefulness for forensic investigations due to the short 7 day retention period.
Microsoft Graph - This does not contain all the relevant data - you cannot access the Unified Audit Logs directly through Graph, and the usage reports do not cover all items contained in the Audit Logs (e.g. Exchange actions).
Search-UnifiedAuditLog on Exchange Online PowerShell - Microsoft themselves recommend against using this programmatically, and I've experienced extremely unreliable results and unmanageable rate-limiting when trying to do so.
So is there something I'm missing here, or is there no way to programmatically retrieve all items from the Unified Audit Logs for the entire retention period? (generally 90 days).
As far as I know, the only way to do this is to use the Management API on a regular basis and output the results to some long-term storage solution (an Azure Log Analytics workspace comes to mind, or a SIEM like Splunk or Graylog), i.e. write a script that retrieves logs for the last week and run it at least weekly.
I'll explain how to retrieve logs manually and also show a tool which already exists for this at the bottom of the post.
Manually:
1: Enable Audit logging on the tenant if not already enabled
2: Create an app registration in Azure AD; for single-tenant audit logs, choose "Accounts in this organizational directory only (xyz only - Single tenant)"
3: Create a 'secret key' (client secret) within the newly created app registration. Store it somewhere safe, as it's only shown once. From the overview page of the app registration, also store the "Tenant ID" and "Application (Client) ID". You will need all three.
4: From within the new app registration, go to "API permissions" and add 'Application' type permissions for 'ActivityFeed.Read' and 'ActivityFeed.ReadDlp'.
5: For the following steps you will need to start calling the Office APIs, for which you will need a bearer token in the header. To obtain one, send the following POST request:
URL: https://login.microsoftonline.com/***tenant_ID***/oauth2/token
Headers: "{'Content-Type': 'application/x-www-form-urlencoded'}"
Data: "grant_type=client_credentials&client_id=Application_ID&client_secret=Secret_Key&resource=https://manage.office.com"
You will receive a JSON response which contains 'access_token'. For all the upcoming API calls, use the following header:
"{'Content-Type': 'application/x-www-form-urlencoded', 'Authorization': 'bearer access_token'}"
6: Subscribe to the audit log feeds you would like to retrieve. The following exist: 'Audit.General', 'Audit.AzureActiveDirectory', 'Audit.Exchange', 'Audit.SharePoint', 'DLP.All'. The POST for Exchange for example would look like: "https://manage.office.com/api/v1.0/tenant_ID/activity/feed/subscriptions/start?contentType=Audit.Exchange"
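Continuing the same sketch, step 6 might look like:

```python
# Start a subscription for each feed you want to retrieve.
for content_type in ("Audit.General", "Audit.AzureActiveDirectory",
                     "Audit.Exchange", "Audit.SharePoint", "DLP.All"):
    sub = requests.post(
        f"https://manage.office.com/api/v1.0/{TENANT_ID}"
        f"/activity/feed/subscriptions/start?contentType={content_type}",
        headers=API_HEADERS,
    )
    sub.raise_for_status()
```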
7: You are now ready to start retrieving actual logs. Individual logs live inside content blobs, which live inside pages, which live inside feeds (e.g. the Audit.Exchange feed). Therefore, for each feed you would like to retrieve logs from, you must collect all the content blobs (iterating through the pages of them) and then retrieve the actual content from that blob.
To retrieve a page of content blobs, use the following URL (substitute your own tenant ID and time window): "https://manage.office.com/api/v1.0/tenant_ID/activity/feed/subscriptions/content?contentType=Audit.Exchange&startTime=2022-04-13T09:42:52&endTime=2022-04-14T08:42:52"
This will give you a JSON response with content blobs inside. In the response header check "NextPageUri"; if it contains a URL, call that URL for the next page of content.
Now that you have content blobs, use them to retrieve the actual logs. Each content blob is a JSON dict, which contains a "contentUri" field. Call that URL to retrieve a JSON response with the actual logs inside.
You can do this in most programming/scripting languages, but for larger amounts of logs you will want to retrieve logs in parallel, or it will take a long time.
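Step 7 as a sequential sketch for one feed (reusing TENANT_ID and API_HEADERS from the blocks above):

```python
import requests
from datetime import datetime, timedelta

# Pull up to 24 hours of Audit.Exchange content blobs.
end = datetime.utcnow()
start = end - timedelta(hours=24)
url = (
    f"https://manage.office.com/api/v1.0/{TENANT_ID}"
    f"/activity/feed/subscriptions/content?contentType=Audit.Exchange"
    f"&startTime={start:%Y-%m-%dT%H:%M:%S}&endTime={end:%Y-%m-%dT%H:%M:%S}"
)

records = []
while url:  # follow NextPageUri until there is no next page
    page = requests.get(url, headers=API_HEADERS)
    page.raise_for_status()
    for blob in page.json():  # each blob dict carries a contentUri field
        content = requests.get(blob["contentUri"], headers=API_HEADERS)
        content.raise_for_status()
        records.extend(content.json())  # the actual audit records
    url = page.headers.get("NextPageUri")

print(f"Retrieved {len(records)} audit records")
```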
With a tool:
In case you want to use an existing tool, this one is free, works on Linux and Windows, and supports multiple outputs.

Listing Microsoft Teams tabs always returns 404

I've been trying this scenario on a couple of different tenants so far:
use an existing team or create a new one through the UI
add a tab (let's say OneNote) to a channel
query the list tabs endpoint (through graph explorer)
I always get a 404 response. If I replace tabs with messages in my query, I get the messages.
In terms of permissions I have the default graph explorer one + Group.Read.All.
Here is the latest request ID I got: 2a180611-b637-4aa4-be27-9e42cbb27ab9 on tenant dev2tolead12. (GET https://graph.microsoft.com/beta/teams/7471ee8d-0ed3-4f22-80ee-3b513e42e6ac/channels/19:9a0544b274654ef8ac97761ebd91b471@thread.skype/tabs)
My question: what am I missing for this request to work?
Sorry, we thought we had deployed the tabs API to all tenants, but had actually deployed it to only some of them – this has been fixed.
The endpoint started working in my tenants today. My guess is Microsoft fixed something recently.
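For reference, a minimal sketch of the same call now that the endpoint is deployed (the token is a placeholder; the question's Group.Read.All permission, or a tabs permission such as TeamsTab.Read.All, should apply):

```python
import requests

ACCESS_TOKEN = "eyJ..."  # placeholder access token
TEAM_ID = "7471ee8d-0ed3-4f22-80ee-3b513e42e6ac"
CHANNEL_ID = "19:9a0544b274654ef8ac97761ebd91b471@thread.skype"

# List the tabs of a channel, as in the question's GET request.
resp = requests.get(
    f"https://graph.microsoft.com/beta/teams/{TEAM_ID}"
    f"/channels/{CHANNEL_ID}/tabs",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
print(resp.status_code)
print(resp.json())
```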

Microsoft Graph Missing Sites

I'm trying to fetch all the sites under our tenant and some are missing. We recently created a new site (not a subsite) and we cannot find it when searching with Microsoft Graph.
Here's the request url that we are using:
https://graph.microsoft.com/v1.0/sites/?search=*
I also tried this URL from the beta docs, but it's not working either:
https://graph.microsoft.com/beta/sites?filter=root%20ne%20null&select=siteCollection,webUrl
However, I can find it with the get-by-path URL:
https://graph.microsoft.com/v1.0/sites/{hostname}:/{site-path}
Does the site need to have a specific setting to appear in the search results?
I want to be able to list all the SharePoint sites that exist.
Based on my latest test, the following URLs do not operate on the same data, so they return different results (note the comment above each URL):
// just returns the team site (a subsite under teams/sites)
https://graph.microsoft.com/v1.0/sites/{hostname}:/{site-path}
// will return the tenant site collection
https://graph.microsoft.com/beta/sites/root?select=siteCollection,webUrl&filter=root%20ne%20null
To get the modern team sites for the groups, we need to use the groups endpoint of Microsoft Graph. There is no API to list all the modern team sites directly; we can loop through the groups and then get each site URL with this API: https://graph.microsoft.com/v1.0/groups/<group-id>/drive/root/webUrl (a sketch of this loop follows below).
Please check whether the site with the issue in your case is a modern team site.
Some reference from a third-party blog:
https://www.eliostruyf.com/get-the-site-url-of-an-office-365-group-via-the-microsoft-graph/
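A minimal sketch of that group loop, assuming an access token with Group.Read.All (the token value is a placeholder) and paginating via @odata.nextLink:

```python
import requests

ACCESS_TOKEN = "eyJ..."  # placeholder: token granted Group.Read.All
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
GRAPH = "https://graph.microsoft.com/v1.0"

# Page through all Microsoft 365 ("Unified") groups...
url = f"{GRAPH}/groups?$filter=groupTypes/any(c:c eq 'Unified')&$select=id,displayName"
while url:
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    body = resp.json()
    for group in body["value"]:
        # ...and resolve each group's team site URL via its default drive.
        drive = requests.get(
            f"{GRAPH}/groups/{group['id']}/drive/root/webUrl", headers=HEADERS
        )
        if drive.ok:
            print(group["displayName"], drive.json().get("value"))
    url = body.get("@odata.nextLink")
```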

/me/activities request in Microsoft Graph Explorer: successful 200 response but empty "value": [ ]

I am trying to retrieve the list of entries that can be seen in the Windows 10 timeline view via the User Activities API in Microsoft Graph.
I have selected the UserActivity.ReadWrite.CreatedByApp permission and am getting an HTTP 200 Success response, but the value returned is empty. I am signed in with a Work Account (O365).
https://graph.microsoft.com/beta/me/activities/recent?$top=5
When I look in the timeline view in Windows 10, I see a comprehensive history of activity. The machine is signed in using the same Work Account.
Any help greatly appreciated.
According to the documentation, only activities created by the current AppID will be returned:
The UserActivity.ReadWrite.CreatedByApp permission will also apply extra filtering to the response, so that only activities created by your application are returned. This server-side filtering might result in empty pages if the user is particularly active and other applications have created more recent activities. To get your application's activities, use the nextLink property to paginate.
I'd suggest dropping the $top parameter and following the nextLink URIs to see if your app's activities show up deeper down the stack.
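A minimal sketch of that suggestion (the token is a placeholder; it walks every page via @odata.nextLink, since the server-side filtering can leave early pages empty):

```python
import requests

ACCESS_TOKEN = "eyJ..."  # placeholder: token with UserActivity.ReadWrite.CreatedByApp
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Start without $top and follow every page of results.
url = "https://graph.microsoft.com/beta/me/activities/recent"
activities = []
while url:
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    body = resp.json()
    activities.extend(body.get("value", []))
    url = body.get("@odata.nextLink")

print(f"Found {len(activities)} activities created by this app")
```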

Google Analytics refresh query

My newly relocated (and slimmed-down) site provides links ending in index.html (rather than the folder name followed simply by /) where they are required.
Google Analytics has not picked up on this and still reports 403 and 404 errors, which no longer apply.
Do I have to do anything else, or should I just wait for Google Analytics to refresh? If so, how long would this typically take?
I should mention that I have checked the site with a dead links checker and all seems OK.
Google Analytics doesn't crawl your site and will never pick up on that. Older reports will contain the old URLs forever.
Newer reports will show the new URLs as you start to track them.
