Analysis Services project lifecycle - data-warehouse

Is there a standard way to start an Analysis Services project? Our situation is that we work locally on our machines, the relational DB is outside our domain, and the server that the Analysis Services DB will eventually sit on is also outside our domain. So, I have a bunch of questions:
Where do you set up the star schema DB (as the source of the OLAP project)? Is it on a separate server somewhere? And how do you keep a dev star schema DB separate from your production star schema DB?
When you create a new Analysis Services project in Visual Studio, where should it connect to (the dev star schema? the prod star schema?)
Analysis Services only supports Windows authentication, so how do you get around this if your local computer and the dev Analysis Services server are not in the same domain?
When doing ETL work (in SSIS), which DB do you connect to? (Dev, I assume.) But then how do you deploy to production?
What about down the line if you need to make changes - how does that process work?
I apologise for the haphazard questions, but I'm not really sure where to start, so if anyone has a standard process from start to finish, please let me know. Thanks!

For this one SSAS project that I recently set up, here is what I did:
I have a development PC and a server. Both in the same domain. The development PC is used to edit the VS project. The server is used to host the trial staging / star schema DB, real staging DB, SSAS development cube, and SSAS production cubes.
During development, I use the trial staging DB to test ETL, and deploy to the SSAS development cube.
To build the real cube, I switch the data source to point to the real staging DB and deploy to a new SSAS cube. Old production cubes are left unchanged so users can still access them while I deploy the new cube. After the new cube is deployed, I announce its availability, and then I can delete the old one. BTW, I do this monthly.
If your SSAS server is not in the same domain, you can create an SSAS DB backup and restore it on the remote server.
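For example, the backup and restore can be scripted in XMLA and run from SSMS on each server. A minimal sketch - the database name and file paths here are placeholders:

    <!-- run against the source server -->
    <Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object>
        <DatabaseID>SalesCube</DatabaseID>
      </Object>
      <File>C:\Backups\SalesCube.abf</File>
      <AllowOverwrite>true</AllowOverwrite>
    </Backup>

    <!-- run against the remote server after copying the .abf file over -->
    <Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <File>C:\Backups\SalesCube.abf</File>
      <DatabaseName>SalesCube</DatabaseName>
      <AllowOverwrite>true</AllowOverwrite>
    </Restore>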
To enable Visual Studio to access the SSAS server, I had to run Visual Studio as administrator (right-click the VS shortcut, 'Run as...'). Alternatively, you can connect to the SSAS server with SSMS on the server itself and add your Active Directory user to the server administrators group.

It sounds like your main problem is that you don't have direct access to the production environment/domain and you're not sure how to deploy your SSAS and SSIS work.
In both cases you'll want to develop against the development database, which is hopefully a copy of the production database.
In SSIS, you will create Connection Managers whose connection strings point to your development environment. Add a package configuration file to expose package properties, like the connection string. There are other ways to manage configuration information, but a config file is the most straightforward. When you build the project with the Create Deployment Utility option enabled, the config file and a .manifest file are created. Together with your package(s), these files can be run on the target server to install and configure the project outside the development environment.
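As a sketch, the relevant part of a .dtsConfig file looks roughly like this - the connection manager name and connection string are placeholders, and at deployment time an administrator only needs to edit the ConfiguredValue:

    <DTSConfiguration>
      <Configuration ConfiguredType="Property"
                     Path="\Package.Connections[StarSchemaDb].Properties[ConnectionString]"
                     ValueType="String">
        <ConfiguredValue>Data Source=devserver;Initial Catalog=StarSchema;Integrated Security=SSPI;</ConfiguredValue>
      </Configuration>
    </DTSConfiguration>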
Similarly, SSAS has a deployment utility. There are several ways to get an SSAS project deployed to a production environment. See: Deploying an Analysis Services Database into the Production Environment for one overview of the options. Note that data sources in your project can be manually modified by an administrator by connecting to the Analysis Services database in SSMS after deployment.
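For reference, the deployment utility can also be run from the command line. Assuming the default build output names (placeholders here), something like the following generates an XMLA script that an administrator can review and execute in SSMS on the production server:

    Microsoft.AnalysisServices.Deployment.exe MyOlapProject.asdatabase /d /o:DeployMyOlapProject.xmla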
Changes in SSIS are generally handled by deploying updated packages. With SSAS, changes can be scripted or the entire database can be redeployed. The approach I would take would depend on the size and complexity of the SSAS database and what is changing.

Related

I am working on creating a baseline developer setup that lets them 'plug and play'. What would be the best option: VMs, containers, or something else?

I am trying to find the best way to achieve the following scenario.
I am currently working on a complex enterprise web application that consists of:
DB
BPM Engine
SOA Engine
Reporting Engine
Web Application Server
IDE
The application currently runs in non-prod and prod environments, but each environment is independent (no infrastructure as code, and deployments go from dev -> ... -> prod).
When a new developer comes in, they can't run the system on their local machine, as it involves too many components (I'll come back to this later). So they do development on their local machine, and to test, they have to publish and deploy to dev. Test, rinse and repeat.
I am currently reverse engineering the whole thing so I can get it working on my local machine, provided I can install and run all the components. I am nearly there after fiddling with a lot of configuration, settings, etc.
I would like others to be able to use this work, so they can also run the project on their local machines. In fact, since we will be migrating soon, I would like to pack the whole thing up in a way that lets me deploy it anywhere (the app already working and configured), parametrised somehow as DEV, SYS, UAT, or PROD. According to my understanding, this is what a Docker image would do for you, correct? You do all the work once and then create an image out of it? Then you can run this image in a container, and that way other people can 'reuse' your work?
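To illustrate what I have in mind - all names below are hypothetical, and each component would presumably need its own image:

    # Dockerfile for one component, e.g. the web application server
    FROM tomcat:9
    COPY webapp.war /usr/local/tomcat/webapps/
    # the target environment (DEV, SYS, UAT, PROD) is injected at run time, not baked in

    docker build -t mycompany/webapp .
    docker run -e APP_ENV=UAT -e DB_HOST=uat-db -p 8080:8080 mycompany/webapp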
Is this the correct way of doing it? Any hints / comments would be appreciated.
Apologies for my writing.

TFS Release Task to Deploy a Web Deploy Package to a specific directory

I've been digging for hours and I haven't been able to find what I would think is a pretty common scenario.
I am attempting to deploy a Web Deploy Package to my existing Web Site\Web App via a TFS Release. The location of my existing Web Apps is mapped to a different drive: the source code on my web server is not in C:\inetpub; let's say it's in D:\MyFiles.
I'm open to using any TFS task to do this. It seems like my two options are:
Run Batch Script - point to myApp.deploy.cmd
WinRm IIS Web App Deployment
I've seen lots of examples of overriding the computer name via the SetParameters file, but I have not seen one example of how to set the target path for the package.
Again, I want to deploy a web package via a TFS release to D:\MyFiles. I've created the package and it deploys locally to C:\inetpub; I would assume that if I can get it to deploy to a specified target location locally, then when I run that same .cmd file from the TFS release it will use that location on the server I deploy to.
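For context, the SetParameters file I mentioned looks like this (names are placeholders); it lets me override the IIS site/application name, but I don't see a parameter there for the physical path:

    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <setParameter name="IIS Web Application Name" value="Default Web Site/MyApp" />
    </parameters>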
UPDATE:
So... this just started working. I'm not sure what the issue was, but the WinRm task didn't do the deploy on Friday and did do it on Monday. I'm thinking it may have been related to using an FQDN for the server path. Honestly, I'm not sure what fixed it or what to do with this post. The answer below by @Andy may help someone, so I won't delete it. That link is a good one, and it showed me how to perform IIS configuration with Web Deploy.
Thanks in advance,
Greg
It seems you are trying to change the physical path of an IIS site/app using MSDeploy.
Just try adding an additional command (appcmd) to the MSDeploy package manifest to change the physical path of the IIS site during the deployment:
<runCommand path="%windir%\system32\inetsrv\appcmd set app /app.name:&quot;Default Web Site/app12&quot; /[path='/'].physicalPath:C:\temp\app12" waitInterval="5000"/>
Refer to this article for details:
WebDeploy/MSDeploy Quick TIP: Change IIS Site/APP Physical Path with MSDeploy

Public testing environment for RoR + Angular

Currently I need a little hint for test-running a RoR + Angular application. By "test running" I mean that the "project owner" can see the current version of the project. For large projects with a lot of developers we had a build server that pulled the current version from the git repository and deployed it as a "nightly build".
For projects where I'm the only developer, I use Dropbox to synchronize my working directory to a server that is accessible to the project owner.
But now I'm working on a small project with 2 other developers. The build server is too much, and the Dropbox solution isn't going to work because each of us has a different state - and working in the same Dropbox directory is a no-go.
So what's the best solution?
Consider running it on Heroku, which has a free tier. Once set up properly, deployments can be done by simply pushing to your Heroku git remote, or you can set it up to watch an existing remote repository branch (such as a particular branch on GitHub).
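Assuming the Heroku toolbelt is installed, the day-to-day workflow is roughly this (the app name is a placeholder):

    heroku create myapp-staging     # one-time setup
    git push heroku master          # each deploy is just a push
    heroku run rake db:migrate      # apply pending migrations on the dyno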
Or you could run it on your own machine. If you are behind a firewall you could set up a tunnel so your team can see it, with something like ngrok.
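For example, assuming the app listens on Rails' default port 3000:

    rails server        # app available at localhost:3000
    ngrok http 3000     # prints a public URL that tunnels to localhost:3000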

Code first migrations when publishing web role to Azure Cloud Services

I have an MVC 4 web application using code first migrations in EF5. Inside my solution, I have a Windows Azure Cloud Service project which has the MVC4 app added as a web role. This then publishes to a Cloud Services instance in Azure. I can publish without any problems, and I have set up the correct web.config transforms so the deployed application points to the correct Azure SQL database. The one thing I can't work out is how to get my code first migrations to run automatically when I publish - or whether that's even possible when publishing to Cloud Services.
After I've published the app, it will happily create the user-related tables, but that's due to code in InitializeSimpleMembershipAttribute explicitly doing that. Is it possible for me to have code first migrations run automatically when I publish to Cloud Services, or will I need to write some code in my app specifically to do this?
I was initially looking into this for deploying my app to a UAT environment in Cloud Services. After reading up on this for a while, it would seem that the general consensus is not to use automatic migrations for deploying to UAT or production. Instead, I'll be using code-based migrations during development; then, when I want to publish my changes to UAT, I'll generate a script for the UAT database using the update-database -Script -ConnectionString "uatconnectionstringhere" syntax.
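In the Package Manager Console that looks roughly like this (the migration name and connection string are placeholders; note that -ConnectionProviderName is required when -ConnectionString is supplied):

    PM> Add-Migration AddOrderTable
    PM> Update-Database -Script -ConnectionString "uatconnectionstringhere" -ConnectionProviderName "System.Data.SqlClient"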
If anyone does actually want to have them run automatically, details are here.
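For completeness, the usual mechanism is the MigrateDatabaseToLatestVersion initializer - a minimal sketch, where MyContext and Configuration stand in for your own DbContext and migrations configuration classes:

    // e.g. in Application_Start; requires using System.Data.Entity;
    // MyContext and Configuration are placeholders for your own types
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<MyContext, Configuration>());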

Does RoR development need shell access?

Let's say the RoR development environment is set up and working.
Does the developer need shell access to develop the RoR application?
Would FTP be good enough?
Why do I ask? I don't want to give my future developers SSH access to my Linux box. Or can I set up their file permissions so they can only read their own project directory?
UPDATE
The whole idea is to have the following running on my VPS Linux hosting:
code repository
production environment
test environment
maybe development environment
for a few projects that are looked after by different people. I want the developers to be able to do their jobs while only being able to access their own project files, and ideally only I would be able to deploy from the test environment into production.
As Tom mentioned, it makes life a lot easier for Rails developers if they have SSH access to the machine so they can migrate the database, run bundle install, check the logs, or just jump into the console.
There are ways to segregate users, though: file/directory permissions, chroot, or turning your Linux machine into a set of virtual machines and giving each developer their own.
You can take a look at how Heroku's client works for possible ideas, since Rails developers are able to deploy, migrate, check logs, and even get into the console without direct shell access. Deployment is all done via git hooks and then their client gives access to particular commands. This is not trivial to set up/get working, though.
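That said, the git-hook half of it is small. A bare-bones sketch (paths and branch name are placeholders) is a post-receive hook in a bare repository on the server:

    #!/bin/sh
    # hooks/post-receive in the bare repository
    GIT_WORK_TREE=/var/www/myapp git checkout -f master
    cd /var/www/myapp || exit 1
    bundle install --deployment
    RAILS_ENV=production bundle exec rake db:migrate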
Well it does not REQUIRE shell access, but it sure makes it easier.
Without it, how would you migrate a DB? You would have to create controllers, models, etc. by hand instead of using the generators.
Short answer: you CAN develop without shell access; it is just awkward and more tedious.
This is a common situation - for instance, Network Solutions allows you to do the basic RoR install but only gives SSH access if you step up and pay extra for a VM hosting package. My suggestion is to create the app on a local machine, of course using shell commands, then FTP-mirror the files up, and use mysqldump to export the local database. NSI gives you a database console where you can then import your database dump file. You will probably have to edit config/database.yml, since the host's database server is unlikely to be localhost. If the necessary gems aren't present, you will have to plead with your hosting customer service.
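The database part of that, concretely (database and file names are placeholders):

    # on the local machine: export the development database
    mysqldump -u root -p myapp_development > myapp.sql
    # upload myapp.sql via FTP, then import it through the host's database console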
