I have TFS 2010 and I am wondering where the global list values live within the TFS databases. I am trying to find the correct database and table that contain the values.
Thanks!
Reading the operational data store directly is not supported; use the APIs we provide instead. To start using the API for global lists, see http://blogs.microsoft.co.il/blogs/shair/archive/2010/03/08/tfs-api-part-23-create-global-list-xml-way.aspx. (The WorkItemStore class also exposes ExportGlobalLists and ImportGlobalLists for round-tripping the XML, and witadmin has matching exportgloballist / importgloballist commands.)
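For reference, global lists round-trip as XML shaped like this (the list name and values below are placeholders):

    <gl:GLOBALLISTS xmlns:gl="http://schemas.microsoft.com/VisualStudio/2005/workitemtracking/globallists">
      <GLOBALLIST name="Locations">
        <LISTITEM value="Seattle" />
        <LISTITEM value="London" />
      </GLOBALLIST>
    </gl:GLOBALLISTS>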
I am using https://github.com/GoogleCloudPlatform/DataflowTemplates for CDC on MySQL, publishing changes to a Google Pub/Sub topic.
In the properties file there is a whitelistedTables= setting, where you have to give a comma-separated list of all the tables you want to monitor for changes.
Is there any straightforward way to whitelist an entire database and in turn all tables in it?
Unfortunately the whitelistedTables parameter does not allow whitelisting all tables in a given database. However, Dataflow templates are customizable: you can download the code from GitHub, modify it, and re-upload your version to GCS, then run your new templated job with this feature added. See this prior question: How to Customize GCP Dataflow template?. The code for the Dataflow templates lives here: https://github.com/GoogleCloudPlatform/DataflowTemplates.
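If modifying the template is overkill, you could also generate the whitelistedTables value yourself from information_schema before launching the job. A rough sketch in TypeScript (assuming Node with the mysql2 package; the db.table qualification format is an assumption here, so adjust it to whatever your properties file expects):

    // generate-whitelist.ts - build the whitelistedTables value for one database
    import mysql from "mysql2/promise";

    async function main() {
      const db = "mydb"; // placeholder database name
      const conn = await mysql.createConnection({
        host: "localhost", user: "root", password: "secret", // placeholders
      });
      // List every table in the database from information_schema
      const [rows] = await conn.query(
        "SELECT table_name AS t FROM information_schema.tables WHERE table_schema = ?",
        [db]
      );
      // Qualification format (db.table) is an assumption; adjust as needed
      const whitelist = (rows as { t: string }[]).map(r => `${db}.${r.t}`).join(",");
      console.log(`whitelistedTables=${whitelist}`);
      await conn.end();
    }

    main().catch(console.error);

You can paste the printed value into the properties file, or regenerate it as part of your deployment step so new tables are picked up automatically.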
I'm developing a web extension for on-premises TFS 2017 Update 3.
It uses the ExtensionDataService to store data.
Normally we're not supposed to manipulate TFS data directly with database scripts, and I assume that applies to the Extension.tbl* tables as well.
I've searched without success for a tool to help manipulate this data, for purposes such as migrating it across environments, scripting an initial load, etc.
I also found VSTS SyncMigrator, but as far as I can tell it doesn't handle extension data.
Should we just build our own tool to do this?
As far as I know, there is no tool that migrates extension data on its own. You could migrate the whole TFS instance by following the link below:
https://learn.microsoft.com/en-us/vsts/tfs-server/admin/move-across-domains
Meanwhile, I have submitted a UserVoice suggestion at the link below; you can vote for it:
https://visualstudio.uservoice.com/forums/330519-visual-studio-team-services/suggestions/33538582-provide-api-or-tool-to-migrate-extension-data-in-t
I ended up writing a quick command-line interface using vsts-node-api.
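Roughly along these lines (a sketch, assuming vsts-node-api's ExtensionManagementApi surface; the PAT, URLs, publisher, extension, and collection names are all placeholders):

    // dump-extension-data.ts - export all documents from one extension data collection
    import * as vsts from "vsts-node-api";

    async function main() {
      const handler = vsts.getPersonalAccessTokenHandler("<pat>"); // placeholder PAT
      const connection = new vsts.WebApi("http://tfs:8080/tfs/DefaultCollection", handler);
      const extApi = await connection.getExtensionManagementApi();

      // Extension data lives in named collections under the Default/Current scope
      const docs = await extApi.getDocumentsByName(
        "my-publisher", "my-extension", "Default", "Current", "my-collection");
      console.log(JSON.stringify(docs, null, 2));

      // To load the documents into another environment, point a second WebApi
      // at the target and call setDocumentByName for each document. Setting
      // __etag to -1 bypasses the concurrency check so existing docs are overwritten.
      // for (const doc of docs) {
      //   await targetExtApi.setDocumentByName({ ...doc, __etag: -1 },
      //     "my-publisher", "my-extension", "Default", "Current", "my-collection");
      // }
    }

    main().catch(console.error);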
All,
What table stores repository names in Crucible? I need to build a list of them using a SQL query.
Thanks,
Repository names are not stored in the database, only in config.xml on the filesystem. There is, however, a REST API call (/rest-service-fecru/admin/repositories) to enumerate them.
(See also this answer from the developers: https://community.atlassian.com/t5/FishEye-Crucible-questions/Bulk-deleting-400-repositories-in-Fisheye-Crucible/qaq-p/301703)
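If it helps, a rough sketch of calling that endpoint from Node (TypeScript, Node 18+ for the built-in fetch; the response field names are assumptions, so inspect the actual JSON for your Fisheye/Crucible version first):

    // list-repos.ts - enumerate repository names via the admin REST API
    async function main() {
      const base = "http://crucible.example.com"; // placeholder host
      const auth = Buffer.from("admin:password").toString("base64"); // placeholder credentials
      const res = await fetch(`${base}/rest-service-fecru/admin/repositories`, {
        headers: { Accept: "application/json", Authorization: `Basic ${auth}` },
      });
      const body = await res.json();
      // "values" and "name" are assumptions -- check the actual payload shape
      for (const repo of body.values ?? []) {
        console.log(repo.name);
      }
    }

    main().catch(console.error);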
I am using OData to query Dynamics CRM Online 2013. I am trying to track changes against particular entities. For example, I want to be able to see old values and new values for Opportunities, as you would in the Summary View. Auditing is enabled for the entities, but the most I can see via OData is whether a field of an entity was changed and when it was changed.
Q. If "Change Tracking" is enabled, will that expose another OData entity that will give me those changed values?
I am pretty sure audit entity data is not exposed via OData.
This post describes actual usage of the Change Tracking feature:
http://www.powerobjects.com/2015/10/26/change-tracking-in-dynamics-crm-2015/
The audit table is not consumable through SDK calls at all, neither OData nor SOAP.
On-premises installations let you query it with SQL, but even then the data is stored as ","- and "~"-separated strings.
Change tracking, on the other hand, is accessible through an SDK call using the RetrieveEntityChangesRequest message; please refer to the link below.
Its primary use, though, is letting integration services identify the records modified since the last cycle for upstream/downstream systems.
https://msdn.microsoft.com/en-us/library/dn932130.aspx
Update: regarding audit, there are some limited options, though: https://yanivrdt.wordpress.com/2016/01/08/retrieving-audit-history-records-via-api/
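For completeness: on newer Dynamics versions that expose the Web API (this does not apply to the 2013 OData endpoint the question targets), change tracking surfaces as OData delta links via the odata.track-changes preference. A rough sketch, with the org URL, entity set, and API version as placeholders:

    // delta-sync.ts - read changes for a change-tracking-enabled entity set
    async function main() {
      const org = "https://myorg.api.crm.dynamics.com"; // placeholder org URL
      const token = "<oauth token>"; // placeholder bearer token
      const res = await fetch(`${org}/api/data/v9.2/accounts?$select=name`, {
        headers: {
          Authorization: `Bearer ${token}`,
          Accept: "application/json",
          "OData-MaxVersion": "4.0",
          "OData-Version": "4.0",
          Prefer: "odata.track-changes", // ask the service for a delta link
        },
      });
      const body = await res.json();
      console.log(body.value); // current rows
      // Persist the delta link and request it next cycle to get only
      // the rows that changed or were deleted since this call.
      console.log(body["@odata.deltaLink"]);
    }

    main().catch(console.error);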
In my program I have multiple databases. One is fixed and cannot be changed, but there are also some others, the so-called user databases.
I thought I would have to open one connection per database and connect to each data dictionary. Is it possible to connect to more than one database through a single connection by handing over the data dictionary filename? By the way, I am using a local server.
Thank you very much,
André
P.S.: Okay, I may have found the answer to my problem.
The keyword is CreateDDLink. The procedure connects to another data dictionary, but a master dictionary has to be set first.
Links may be what you are looking for, as you indicated in the question. You can use the API or SQL to create a permanent link alias, or you can create links dynamically on the fly.
I would recommend reviewing this specific help file page: Using Tables from Multiple Data Dictionaries
For a permanent alias (using SQL), look at sp_createlink. You can either create the link to authenticate as the current user or set up the link to authenticate as a specific user. Then use the link name in your SQL statements:
select * from linkname.tablename
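To create the permanent alias in the first place, something like this should work (a hedged sketch; verify sp_CreateLink's exact parameter list against the Advantage help for your version, as the user name, password, and trailing "global link" flag here are assumptions):

    EXECUTE PROCEDURE sp_CreateLink( 'linkname', '..\dir\otherdd.add', 'user', 'pass', FALSE );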
Or, dynamically, you can use the following, which will authenticate as the current user:
select * from "..\dir\otherdd.add".table1
However, links are only available from SQL. If you want to use the table directly (i.e., via a TAdsTable component), you will need to create views instead. See KB 080519-2034. The KB mentions you can't post updates if the SQL statement for the view results in a static cursor, but you can get around that by creating triggers on the view. A minimal sketch of the view workaround follows.
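The view simply wraps the link-qualified query (view, link, and table names below are placeholders):

    CREATE VIEW linked_table1 AS SELECT * FROM linkname.table1;

A TAdsTable component can then open linked_table1 like any other dictionary table.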