Renaming a Server with SharePoint 2010

From time to time, you may need to rename a server that hosts an instance of SharePoint 2010. For example, you may want to clone and rename a server for backup purposes and use it on the same network as the original installation. When carefully executed, the rename leaves you with a fully functional copy of your SharePoint site. A flexible and robust Business Intelligence platform offers users and operators the best overall experience.

Below are the steps that we here at Decentrix follow to rename one of our servers that contains an installation of SharePoint 2010. These steps were derived with the assistance of Alex Anikin's blog post (http://aanikin.wordpress.com/2010/09/01/how-to-rename-SharePoint-2010-server/) and the Microsoft TechNet article (http://technet.microsoft.com/en-us/library/cc261986.aspx).

Rename your server in SharePoint

To begin, use either STSADM or PowerShell to rename your server.

Using STSADM

  1. Verify that you are a member of the local Administrators group on the local computer in order to run Stsadm.
  2. Open a cmd window AS ADMINISTRATOR (always do this when running stsadm commands, even if you are a local administrator).
  3. On the drive on which SharePoint Server 2010 is installed, change to the following directory: %COMMONPROGRAMFILES%\Microsoft shared\Web server extensions\14\Bin\
  4. Type the following command, and then press ENTER: stsadm -o renameserver -oldservername <oldServerName> -newservername <newServerName>  
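For example, with hypothetical old and new server names:

stsadm -o renameserver -oldservername OLDSPSERVER -newservername NEWSPSERVER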

Using PowerShell
Note: We do not recommend the use of the Windows PowerShell Rename-SPServer cmdlet because it does not work if the server on which the configuration database is stored has already been renamed. Currently, SharePoint supports both PowerShell and STSADM.

1.    On the Start menu, click All Programs.
2.    Click Microsoft SharePoint 2010 Products.
3.    Click SharePoint 2010 Management Shell.
4.    At the Windows PowerShell command prompt, type the following command:
Rename-SPServer [-Identity] <OriginalServerName> -Name <NewServerName>

  • <OriginalServerName> is the original name of the stand-alone server.
  • <NewServerName> is the new name of the stand-alone server.

Rename your server in IIS

To ensure that the rename operation is complete, run iisreset /noforce at a Windows command prompt.

In Internet Information Services Manager, right-click the server node, choose Rename, and then reset IIS.

Reconfigure your SharePoint 2010 Alternate Access Mappings
In order to get BI components (SSRS, Performance Point, Excel Services, etc.) and other SharePoint Features working with the new server (and SharePoint site) names, the alternate access mappings must be updated to the name of the new server.

Central Admin (with new machine name) -> System Settings -> Configure Alternate Access Mappings
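If you prefer to script the change, a minimal PowerShell sketch might look like the following; run it from the SharePoint 2010 Management Shell, and note that the URLs are hypothetical placeholders:

Get-SPAlternateURL
Set-SPAlternateURL -Identity "http://oldservername" -Url "http://newservername"

Get-SPAlternateURL lists the current mappings first, so you can confirm which entries still reference the old server name before updating them.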

At this point, the majority of SharePoint's functionality should be working properly. Performance Point, SSRS, and Excel Services should all be operational. To verify that the process was successful, confirm that each reporting platform can open a report and refresh its data connections without errors or warnings. It may be necessary to update data source definitions if the location of databases or cubes has changed.

Getting Power Pivot reporting and the corresponding SharePoint Power Pivot Gallery working again is a more in-depth process that will be covered in a future blog post.

HTML Formatted sp_send_dbmail For Smartphones

Sometimes, during long-running, late-night ETL processes, being alerted only when things fail is too nerve-wracking for comfort. Waiting around for hours for the hammer to fall can be trying, and having to log into the system and continually run status queries ties one to a computer all night and is prone to error. A regularly scheduled notice showing the SQL Agent job's progress can be very comforting. In those situations we like to use the sp_send_dbmail procedure to create a small progress report that is nicely formatted to fit on a smartphone screen.

Since we are sending a standard HTML email, this functionality is phone brand/carrier independent. If your phone can receive an HTML email, this tool will work for you. We create a SQL Agent job that runs every 30 minutes during the ETL window and executes a script similar to this:
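(A minimal sketch; the etl.BatchLog and etl.Distribution tables, their columns, and the ETLMail Database Mail profile are hypothetical placeholders.)

DECLARE @distro VARCHAR(MAX) = '';
DECLARE @body   NVARCHAR(MAX);

-- Flatten the distribution table into a semicolon-delimited recipient list,
-- then trim off the trailing semicolon (assumes at least one recipient).
SELECT @distro = (@distro + [Email] + ';')
FROM etl.Distribution;
SET @distro = LEFT(@distro, LEN(@distro) - 1);

-- Build the HTML table with FOR XML PATH: ROW_NUMBER() numbers the batches,
-- and every value aliased AS td becomes a cell in the current <tr> row.
SET @body =
      N'<h3>Batches complete: '
    + CAST((SELECT COUNT(*) FROM etl.BatchLog) AS NVARCHAR(10)) + N'</h3>'
    + N'<table border="1">'
    + N'<tr><th>#</th><th>System</th><th>Finished</th><th>Min</th></tr>'
    + CAST((
          SELECT ROW_NUMBER() OVER (ORDER BY FinishTime) AS td, '',
                 SystemId                                AS td, '',
                 CONVERT(VARCHAR(8), FinishTime, 108)    AS td, '',
                 DATEDIFF(MINUTE, StartTime, FinishTime) AS td
          FROM   etl.BatchLog
          FOR XML PATH('tr')
      ) AS NVARCHAR(MAX))
    + N'</table>';

-- Send the report as an HTML-formatted email.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'ETLMail',
     @recipients   = @distro,
     @subject      = 'ETL progress report',
     @body         = @body,
     @body_format  = 'HTML';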

There are several neat little tricks happening here. We build the HTML using the FOR XML PATH functionality, adding HTML format tags, creating an HTML table, and so on. We grab the batch number using the ROW_NUMBER() OVER functionality, and the semicolon-delimited list of email recipient addresses is built by flattening out the distribution table with select @distro = (@distro + [Email] + ';') and then trimming off the trailing semicolon.

The final result is a simple HTML table with a header count and four columns showing the row number, the sending system id number, the time the batch finished, and the number of minutes it took to process.

As the night progresses and this table grows to encompass the total number of expected batches, the on-call technician continues to receive reassuring emails that all is well and proceeding as planned. Of course, an exception-handling system that alerts the staff when a process fails is also required, but this “Management by Inclusion” is a nice addition to an exception alert scheme.

SSIS Data Sources in Configuration Files

One of the best ways to make your SSIS packages easily movable between environments is to use configuration files to store data connections.

Once you have created your SSIS package, click on the main control flow window, select “SSIS” from the menu bar, and then select “Package Configurations”.

In the Package Configurations Organizer window select “Enable package configurations” and then click “Add…”

The “Package Configuration Wizard” opens and defaults to a Configuration type of “XML configuration file”. Select the location and filename that will hold your configuration information (remember, this location must be the same on all environments, so make sure you have the same folder structure on every server).

In the Objects window, find the Connection Manager you would like to have stored in the configuration file and select the attributes to be stored. If you are using Windows Authentication, you can simply select “ServerName” and “InitialCatalog”. If you are using a SQL user for authentication, you will also need to select “Name” and “Password”. Keep in mind that the password will be stored in plain text in the configuration file, so file security should be considered to keep this information safe. By default, SSIS stores the values that you entered when creating the data source in the package; if these are not correct, you will need to update the file once it is created.

Once you have selected the options to be stored, click Next and name the configuration. We find the best practice is to name the configuration the same as the data connection. After the file is created, you can copy it to all environments, updating the values in the file for each appropriate environment.
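A generated configuration file looks roughly like the sketch below; the MyDataSource connection name, server, and database values are hypothetical. To repoint an environment, edit only the ConfiguredValue elements.

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
      Path="\Package.Connections[MyDataSource].Properties[ServerName]"
      ValueType="String">
    <ConfiguredValue>DEVSQL01</ConfiguredValue>
  </Configuration>
  <Configuration ConfiguredType="Property"
      Path="\Package.Connections[MyDataSource].Properties[InitialCatalog]"
      ValueType="String">
    <ConfiguredValue>StagingDB</ConfiguredValue>
  </Configuration>
</DTSConfiguration>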

Now that you have created the configuration file, you can easily move the SSIS package from environment to environment, eliminating the need to update the package itself for each environment's connections.

2011 Microsoft Partner of the Year Award Finalist!

Decentrix is honored to announce that it has been selected as one of two (2) finalists for the Microsoft Communications Sector 2011 Partner of the Year Award. Awards were presented in mid-July 2011 for multiple categories, with winners and finalists chosen from a set of more than 3,000 entrants worldwide.

The Communications Sector Partner of the Year Award recognizes Decentrix as adding business value to customers by deploying new and innovative solutions based on Microsoft technologies. More specifically, it recognizes significant customer impact, solution innovation, speed-to-market, deployment, and utilization of advanced Microsoft technology features over the past year.

To read the full press release: Decentrix Press Release

Cube Synchronization through Analysis Services DDL Tasks in SSIS

One of the most common things we do in the Business Intelligence (BI) world is to process and synchronize OLAP cubes. There are several methods people use for synchronizing, each with its own benefits, but normally we take the XMLA script generated by the synchronization wizard and use it in an SSIS package called by a scheduled job. Recently, a sync job on a client's server started failing because of a lack of disk space.

The cube on the server was 220 GB, and there was about 200 GB of free space on the disk. The problem is that all of the cube files from the source cube are copied to the secondary server so that the files can be compared, deleted, and merged, leaving a final set of files representing the synced version of the cube. This can be problematic because this method requires enough disk space to store two complete cubes, in our case 440 GB worth. While this is not much space in the grand scheme of Business Intelligence, we were dealing with older hardware that was used for production reporting and couldn't wait for additional disk to be added.

Our solution was to drop the Analysis Services database prior to syncing so that we would only have one set of files to work with. According to the Analysis Services Synchronization Best Practices article from SQLCAT, some tests showed that syncing was actually faster when the cube was dropped first. Another reason dropping the database prior to syncing was attractive for us was that we were dealing with locking issues during the normal sync: users running queries through Excel workbooks would lock the database and not allow the sync to finish, in some cases for up to 4 hours.

To test this scenario, we set up an SSIS package with two Analysis Services Execute DDL tasks and one data connection.

Drop Database Task:
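A sketch of the XMLA body for this task, assuming the Analysis Services database ID is BIAnalytixCube:

<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>BIAnalytixCube</DatabaseID>
  </Object>
</Delete>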

Sync Cube Task:
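A sketch of the Synchronize command, similar to what the synchronization wizard generates; the source server and database ID are placeholders:

<Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=SOURCESERVER;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>BIAnalytixCube</DatabaseID>
    </Object>
  </Source>
  <SynchronizeSecurity>CopyAll</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>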

The first step executed successfully, but the Sync Cube step failed with an error.

We did a quick internet search and found quite a few forums where people were having the same problem with no apparent solution: their DDL tasks failed when the database was not present. We decided to try modifying the data connection, because the error actually makes sense; the connection manager can't find the database because we had just dropped it in the previous step. We modified the connection to point at the entire instance by removing the Initial Catalog, tested the connection, and everything looked good.
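For illustration (server and database names are hypothetical), the change amounts to dropping the Initial Catalog property from the connection string:

Before: Provider=MSOLAP;Data Source=TARGETSERVER;Initial Catalog=BIAnalytixCube;Integrated Security=SSPI
After:  Provider=MSOLAP;Data Source=TARGETSERVER;Integrated Security=SSPI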

We executed the package and this time, after about an hour, we had a successful cube sync.

BI on the Windows Phone 7 Platform

The intention of this series of blog posts is to show developers the process of setting up, creating, maintaining, and delivering meaningful Business Intelligence content using the Windows Phone 7 mobile platform as the end-user, client-side component.

Getting started

Before we can deliver BI to the handset, we have to configure a development environment specifically for WP7 work. I'm going to describe our environment and procedures; you may have success with different setups, but this is how we did it:

  • The first thing is to have “Silverlight for Windows Phone” show up in the Installed Templates pane in Visual Studio 2010 when you click File -> New Project:

This functionality did not come with the install of Visual Studio 2010 Premium; we had to download the SDK to get it to appear. You can download and install a copy of the SDK, called the Windows Phone Developer Tools, here: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=13890

Once you have downloaded and installed the Developer Tools, upgrade the installation with the most recent update: http://www.microsoft.com/download/en/details.aspx?id=23854

As of this writing there are two files to download for the January 2011 update; be sure to download and install both VS10-KB2486994-x86.exe and WindowsPhoneDeveloperResources_en-US_Patch1.msp. These updates include a phone emulator that exposes the copy/paste functionality, a phone capability detection tool, and the functionality that allows you to connect a Windows Phone to a PC without having Zune installed and running on the PC.

Once all software and updates are installed, select Windows Phone Application from the Silverlight for Windows Phone templates, name your project, and click OK. The development environment will open with a XAML shell project split into two panes: a WYSIWYG phone emulator on the left and XAML code on the right. To test your environment, ensure “Windows Phone 7 Emulator” is selected in the target dropdown and click the Start Debugging arrow, or hit F5.

If everything is set up correctly your application should connect to the emulator and run in debug, showing an interactive phone emulator on the screen.

  • Charts and graphs are usually major components of any BI analytics system. For charts and graphs on the Windows Phone platform we turn to the Silverlight Data Visualization Toolkit.
  • We will cover how to add the charting controls to your application in the next WP7 blog post…

Moving SharePoint Content to a New Environment

The situation often arises when users want to move the content of one SharePoint site to another. The content may have been created in a development environment and need to be moved to production, or it may need to be packaged for deployment on multiple SharePoint portals. The easiest way to move the entire content of one SharePoint portal to another is by using one of the numerous backup and restore functions available at the SharePoint Farm, Web, or Site levels. But this alone is not a complete and robust solution: much of the content within SharePoint still references stale information from the previous environment. BIAnalytix offers best practices to update your Business Intelligence report content to the new environment through a series of procedures. These procedures differ depending on which reporting tool you are using.

The variety of content that a SharePoint portal can serve up is quite large, and at Decentrix we focus primarily on content that pertains to Business Intelligence. The strategies outlined below work for both SharePoint 2010 and SharePoint 2007. Some of these strategies are implemented when designing SharePoint, and others are steps that need to be executed as part of the process of moving content from one site to another.

Designing SharePoint Dashboards and Navigation Links in a Portable Environment
When designing your SharePoint site (especially one where you plan on moving or replicating content), it is extremely important to use relative URL paths rather than full URL paths. Relative URL paths should be used for your navigation links (both global and local), links to images, links on a dashboard to reports within your site, and so on. If you take care to design your links using relative paths, you will not have to make any updates to the structure of your site when you move the content.
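For example (the site and page names here are hypothetical), an absolute link breaks after a move while a relative link survives unchanged:

<!-- Absolute link: breaks when the content moves to a new server -->
<a href="http://devserver/sites/BI/Dashboards/Revenue.aspx">Revenue</a>

<!-- Relative link: resolves against whichever server hosts the site -->
<a href="/sites/BI/Dashboards/Revenue.aspx">Revenue</a>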

Updating SSRS Reports and Data Connections

The data connections of SSRS reports are not updated automatically when you move your content to a new site. Luckily, deploying SSRS reports is easy with Visual Studio BIDS. All that needs to be done is to update your data connections and deployment paths to the new environment and redeploy your reports to the new site.

Updating Performance Point Reports and Data Connections

Data connections for Performance Point reports are not updated when moving content either, but correcting the problem is not difficult. You can update the data connections that reports use via the Performance Point designer software. Test that your reports are using the correct data connections in the new environment, save everything, and you are done.

Updating Excel Services Reports and Data Connections

The most difficult aspect of moving SharePoint content is moving Excel Services reports. First, you must make sure that Excel Services Trusted Locations and Trusted Data Connections are correctly set for the new environment. However, that alone does not make your Excel Services content available to your users. Each Excel file has an embedded data connection, and the embedded connection will not update after moving the SharePoint content, even if “refresh data when opening the file” and “always use connection file” are checked.

What can you do then? Luckily, we here at Decentrix have just the fix. We license an SSIS component that will take all of your Excel Services content within your SharePoint Farm and update it to reference the new environment. If you are interested in licensing this tool, please contact us here at Decentrix.

Cube Data Ranking by Dimension

Businesses often ask to rank aspects of their operations by the revenue they generate. Adding each new ranking as a new dimension expands the footprint of a cube and can degrade cube build and query performance. Instead of adding ranking dimensions, BIAnalytix recommends implementing ranking functionality via MDX (Multidimensional Expressions) or DAX (Data Analysis Expressions).

I received a request this week to rank the members of a network dimension by revenue per year. I wanted to implement this without adding a new dimension to the cube, because next week I could be asked to rank the members of the production dimension by revenue per month. Instead, I wanted a dynamic and robust ability to create as many rankings as needed while still being able to report, slice/dice, and drill against my entire BIAnalytix cube.

I created rankings of Networks by the revenue that they generated in a given year, using Power Pivot and Excel to solve the problem. I came across the following entry, http://powerpivotpro.com/2010/03/08/writing-a-rank-measure-and-living-to-tell-the-tale/, which looked like a very promising solution.

Using the following DAX code I was able to get a ranking in Excel.
COUNTROWS(
    FILTER(
        ALL(Table[Dimension]),
        [Measure](VALUES(Table[Dimension])) < [Measure]
    )
) + 1

In the expression above, the FILTER function generates a table from ALL of the DIMENSION's members, keeping only those members whose MEASURE value is greater than the current member's MEASURE value. The COUNTROWS function counts those members, and adding 1 turns the count into the current member's rank.
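Applied to the scenario above, with a hypothetical Network[Network] column and [Revenue] measure, the measure would look like:

COUNTROWS(
    FILTER(
        ALL(Network[Network]),
        [Revenue](VALUES(Network[Network])) < [Revenue]
    )
) + 1

With Year on the pivot's filters or columns, the rank is computed per year automatically, because [Revenue] is evaluated in the surrounding filter context.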

The piece that caused me the most problems is that you CANNOT do this as a new calculated column in your Power Pivot data source. If you try to create it in the Power Pivot data source, you will get the error: “The function used in this expression is not a valid function nor the name of a measure.”

The DAX code must be implemented using a new Power Pivot Measure from the Excel Ribbon.

The new measure will appear in the field list of your Power Pivot data source.

You should now be able to add the measure to your report, showing off a dimensional rank.

Why is a Corporate Portal vital to your BI Initiatives?

Without an easy way to disseminate analysis that supports business decisions throughout your corporation, the value of your Business Intelligence is severely limited. A Corporate Portal is now acknowledged as the simplest and most effective method for the presentation and dissemination of Media Business Intelligence to all stakeholders in the organization. Portals provide a corporate-wide, customizable and authenticated vehicle for referencing critical information contributed by all authorized staff. It is a valuable empowerment tool for a media corporation.

The Portal environment uses the trusted authentication of the user at the time they log in to the network to support a personalized view of their access to the corporate information. This personalization occurs in a number of important ways. Personal filter values (identifying each user's analysis preferences) can be applied to dashboards to present filtered data for the user's normal day-to-day responsibilities. Independently, through data-level security, users are only allowed to view their authorized data. Users can also save their own favorites lists and personal reports on their personal home site. So having the Media Business Intelligence reports delivered by a Portal adds considerable personalization and relevance to the analysis, and ensures that individuals have access to all the data they need, but only the data that is permitted to them.

In particular, the KPI (Key Performance Indicator) values defined as the agreed corporate-wide goals can be automatically filtered to the individual's scope of responsibility on personal site home pages and dashboards. This gives individuals immediate feedback on how they are contributing to corporate goals on a daily basis. Portals also provide web-browser-level access to information, which means there are no special installation prerequisites that could limit the user's access to this important corporate information.

BIAnalytix is a media data warehouse and business intelligence system from Decentrix which includes portal reports and dashboards for the presentation of the Media Business Intelligence analysis. The user’s authentication on the portal is used for both active presentation filtering as well as for the data level security for information retrieval and display. Decentrix provides in-depth training to your power users who support the ongoing customization of your portal reports and dashboards that evolve to continue to meet the corporation’s strategic requirements.

Traditional Projection Reports vs Revenue Forecasting

Projection reports showing the future revenue currently on the books are one of the Media Industry's traditional reporting cornerstones. However, more powerful analysis that transforms these figures into a forecast of the “final” values provides significant advantages for decision making because of the numerous comparison values available, such as goals, budgets, expectations, and past historical final values.

Operational Traffic systems typically produce booked-business status reporting, such as a 6-week or 6-month “projection” report, by a variety of criteria (dimensions) such as salesperson, advertiser, or product code. These operational reports provide no real forecast of what your ongoing bookings will finally invoice, only what is currently booked for the future. A forecast of your future business needs to come from formulas and models (possibly using past pacing) that realistically calculate the future expected values, such as unsold inventory and invoice revenue, and add these to your current booked business. One simple example of this approach is to multiply unsold capacity by your historical average rate and add this to booked business (and there are many other, more sophisticated models which can be used).
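As a minimal sketch of that simple model (all table and column names below are hypothetical placeholders):

SELECT  b.BroadcastMonth,
        b.BookedRevenue,
        -- expected revenue from inventory not yet sold
        i.UnsoldSpots * r.HistoricalAvgRate                   AS ExpectedUnsoldRevenue,
        -- forecast "final" value = booked business + expected unsold revenue
        b.BookedRevenue + i.UnsoldSpots * r.HistoricalAvgRate AS ForecastRevenue
FROM    dbo.BookedBusiness  AS b
JOIN    dbo.UnsoldInventory AS i ON i.BroadcastMonth = b.BroadcastMonth
JOIN    dbo.HistoricalRates AS r ON r.BroadcastMonth = b.BroadcastMonth;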

Knowing in advance where your business is heading, using facts rather than instinct, allows Media decision makers to consistently interpret their future business situation with more clarity and act earlier to exploit, change, or correct the future outcome. Decision making is simplified because forecasts can be easily compared to expected or budgeted results or to historical performance; for example, “are we forecasting being oversold?” Consequently, we strongly recommend adding trusted forecasts (using formulas that have proven reliable for your business) to the standard reporting to simplify decision making.

BIAnalytix is a data warehouse and business intelligence system from Decentrix that includes all the advanced tools necessary to build powerful forecasts based on past pacing performance and to deliver the resulting exception analysis for a powerful media BI system.