SharePoint For The Small Business

Whilst having a quick break from work, I had my usual look at my Twitter account and found the following post by @SP365UK – and thought I should share it with you.

I deal with many small businesses during my day job at Orchid IT and think this post is very relevant to many of our customers:

Why SharePoint?

Most of the blog posts, articles and speaker presentations I have seen deal with SharePoint implementations of hundreds or even thousands of users. But a very small company has several characteristics which also make it well suited to take advantage of much of what SharePoint has to offer. To start, a small business has a limited number of team members who must work well together to achieve the goals of the business; hence collaboration is inherent and is not a barrier to adoption of a collaboration platform. A small business also is (or had better be) very agile in its ability to quickly try new ways of doing things IF the new things can be seen to quickly and directly benefit the business…


Why SharePoint?

…Because an unfortunate additional characteristic of a small business is its lack of time and money to devote to IT projects. In the case of my company, I was the co-owner, head of finance, head of human resources, and constituted the entire IT department; I think that scenario is pretty common in a very small company. But although resources are limited, there are still business information needs which must be met, quickly and efficiently. A business such as this can’t afford to spend tens of thousands on full-blown ERP or other information systems, so an easily customizable & scalable platform such as SharePoint just fits the bill.

As a small business with very limited financial resources, we took full advantage of many of the free, or nearly so, SharePoint resources available. This is by no means an exhaustive list, but rather represents the resources that I personally found to be essential.

Start with the “big win”

I’ll describe below some of the ways in which we were able to leverage SharePoint to improve our team members’ ability to do their jobs.

The first, and most essential, thing we did with SharePoint — the thing that got me so excited about SharePoint when I first learned about it at a user group meeting several years ago — was to bring disparate information regarding our custom manufacturing projects into one readily accessible location. We had several essential Excel spreadsheets regarding different aspects and statuses of our projects, maintained by different people, and containing different (but obviously related) information. We used an Access database to track bids, and we updated Word documents manually each week to communicate bid and job information to the managers and employees.


By creating a few custom lists (but primarily a “Jobs” list), we were able to bring all of the important information about bids and jobs into a single repository to which everyone had access, and which could be viewed in any way that was required. We were able to fairly quickly eliminate the use of the tracking spreadsheets and the report documents, by requiring everyone to make their project updates in SharePoint only, and to use SharePoint views to report on-screen at operations meetings. Everyone very soon saw the advantage in having up-to-the-minute information on all current work (usually around 30 projects) at their fingertips. Over time, people thought of more and more bits of job information that would be great to have access to in SharePoint, so our list grew and spawned other lists such as Customers, Architects, Change Orders, etc. More and more views were created on that same data, such as a Poster we could print and post in the lunchroom for the plant workers, various bid analysis views (including one that fed an Excel PivotChart), and a view for the drafters to track the status of their work.

Move all databases (SharePoint Server 2010)

 

The following kinds of databases hosted on a single database server can be moved by using the procedures in this post:

  • Configuration database
  • Central Administration content database
  • Content databases
  • Service application databases

 

Moving all databases

The process of moving all of the databases from one database server to another database server requires you to work in both SharePoint Server 2010 and SQL Server. The following list summarizes the process of moving all databases, with detailed steps presented in the subsequent procedures:

  1. Prepare the new database server. For details, see To prepare the new database server.
  2. Close any open Windows PowerShell management shell windows and any open Stsadm command prompt windows. For details, see To close any open management sessions.
  3. In the Services Microsoft Management Console snap-in, stop all of the services related to SharePoint Server 2010 and Internet Information Services (IIS). For details, see To stop the farm.
  4. In SQL Server, detach the databases from the current instance. For details, see To detach databases.
  5. Using Windows Explorer, copy or move the .mdf, .ldf, and .ndf files associated with the database from the source server to the destination server. For details, see To move database files to the new server.

    Note:

    You can also back up all databases and restore them to the new server. Procedures for backing up and restoring all databases are not included in this article. For more information, see How to: Back Up a Database (SQL Server Management Studio) (http://go.microsoft.com/fwlink/?LinkID=179208) and How to: Restore a Database Backup (SQL Server Management Studio) (http://go.microsoft.com/fwlink/?LinkID=183032).

  6. In SQL Server, ensure that all of the SQL Server logins, fixed server roles, fixed database roles, and permissions for the databases from the source server have also been configured correctly on the destination server. For details, see To set up permissions on the new server.
  7. In SQL Server, attach the database to the new instance. For details, see To attach databases to the new instance of SQL Server.
  8. Use a SQL Server connection alias to point to the new database server, and then update all Web servers to use the alias. A connection alias is a defined alternate name that can be used to connect to an instance of SQL Server. You have to configure the alias on all Web servers and application servers in the farm. For details, see To point the Web application to the new database server by setting up SQL Server connection aliases.

    Note:

    The use of SQL Server client aliases is recommended as part of hardening SQL Server for SharePoint environments. For more information, see Harden SQL Server for SharePoint environments (SharePoint Server 2010).

  9. Restart the services. For details, see To restart the services in the farm.

The following are the minimum permissions that are required to perform this process:

  • You must be a member of the Farm Administrators SharePoint group.
  • On the computer that is running the SharePoint Central Administration Web site, you must be a member of the Administrators group.
  • On the database server from which the databases are being moved, you must be a member of the following:
    • The Administrators group
    • The db_backupoperator fixed database role
  • On the database server to which the databases are being moved, you must be a member of the following:
    • The Administrators group
    • The db_owner fixed database role

In some environments, you must coordinate the move procedures with the database administrator. Be sure to follow any applicable policies and guidelines for managing databases.

To prepare the new database server

To close any open management sessions
  • Close any open Windows PowerShell management shell windows, and any open command prompt windows if you have been running the Stsadm command-line tool.

To stop the farm
  1. On the server that is running the Central Administration Web site, in the Services snap-in, stop the following services:

    • SharePoint 2010 Administration
    • SharePoint 2010 Timer
    • SharePoint 2010 Tracing
    • SharePoint 2010 User Code Host
    • SharePoint 2010 VSS Writer
    • SharePoint Foundation Search V4
    • World Wide Web Publishing Service
    • SharePoint Server Search 14
    • Web Analytics Data Processing Service
    • Web Analytics Web Service

      Note:

      The final two services are part of the Web Analytics service application. If you are running the Web Analytics service application and choose to rename your server, you must also reconfigure the Web Analytics database locations. For details, see To reconfigure Web Analytics database locations.

  2. On the server that is running the Central Administration Web site, at the command prompt, type iisreset /stop.
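
If you prefer to script this step, the same services can be stopped from a Windows PowerShell prompt. This is only a sketch – it assumes the service display names match the list in step 1:

Get-Service -DisplayName "SharePoint 2010*","SharePoint Foundation Search V4","SharePoint Server Search 14","Web Analytics*","World Wide Web Publishing Service" | Stop-Service -Force

The -Force switch also stops dependent services, which is what you want when shutting down the farm.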

To detach databases
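
As a sketch, detaching can be scripted with sqlcmd on the database server; the instance and database names below are placeholders, so substitute your own:

foreach ($db in "SharePoint_Config","SharePoint_AdminContent","WSS_Content") {
    sqlcmd -S OldSqlServer -Q "EXEC sp_detach_db '$db';"
}

sp_detach_db will fail if there are open connections to the database, which is one reason the farm is stopped first.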

To move database files to the new server
  • Using Windows Explorer, locate the .mdf, .ldf, and .ndf files associated with each database that you are moving, and then copy or move them to the destination directory on the new computer that is running SQL Server.

To set up permissions on the new server
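
As a minimal sketch (the account, database, and instance names are placeholders), a farm account's login and database role membership could be recreated on the destination instance with sqlcmd:

sqlcmd -S NewSqlServer -Q "CREATE LOGIN [CONTOSO\spfarm] FROM WINDOWS;"
sqlcmd -S NewSqlServer -d WSS_Content -Q "EXEC sp_addrolemember 'db_owner', 'CONTOSO\spfarm';"

Repeat for each login, fixed server role, and database role that existed on the source server.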

To attach databases to the new instance of SQL Server
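
As a sketch (the instance name, database name, and file paths are placeholders), each database can be attached with CREATE DATABASE … FOR ATTACH:

sqlcmd -S NewSqlServer -Q "CREATE DATABASE [WSS_Content] ON (FILENAME = 'E:\Data\WSS_Content.mdf'), (FILENAME = 'E:\Data\WSS_Content_log.ldf') FOR ATTACH;"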

To point the Web application to the new database server by setting up SQL Server connection aliases
  1. Start SQL Server Configuration Manager. On the Start menu, point to All Programs, point to Microsoft SQL Server 2008, point to Configuration Tools, and then click SQL Server Configuration Manager.

    Note:

    If SQL Server Configuration Manager is not installed, you must run SQL Server setup to install it.

  2. Expand SQL Native Client Configuration, right-click Aliases, and then click New Alias.

  3.  In the Alias Name field, enter the name of the original SQL Server instance.  For Protocol, verify that TCP/IP is selected.  For Server, enter the name of the new server that is hosting the SharePoint Server 2010 databases, and then click OK.

  4. Repeat this procedure on each Web and application server.

  5. Optional. If your environment relies on System Center Data Protection Manager (DPM) 2010 or a third-party application that uses the Volume Shadow Copy Service (VSS) framework for backup and recovery, you must install the SQL Server connectivity components on each Web server or application server by running SQL Server setup. For more information, see How to: Install SQL Server 2008 R2 (Setup) (http://go.microsoft.com/fwlink/?LinkID=186119).
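
The alias created in SQL Server Configuration Manager is ultimately stored in the registry, so the same change can be scripted across all Web and application servers. The sketch below assumes a TCP/IP alias (the DBMSSOCN prefix) and uses placeholder server names:

$alias  = "OldSqlServer"            # the server name SharePoint already uses
$target = "DBMSSOCN,NewSqlServer"   # TCP/IP prefix plus the new database server
foreach ($path in "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
                  "HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo") {
    if (-not (Test-Path $path)) { New-Item -Path $path -Force | Out-Null }
    New-ItemProperty -Path $path -Name $alias -Value $target -PropertyType String -Force | Out-Null
}

The Wow6432Node path covers 32-bit client libraries on a 64-bit operating system.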

To reconfigure Web Analytics database locations
    Note:

    This procedure is required only if you are running a Web Analytics service application and you have renamed your server instead of using SQL Server connection aliases.

  1. On the SharePoint Central Administration Web site, under Application Management, click Manage Service Applications.

  2. Select the Web Analytics service application, and then click Properties.

    The Edit Web Analytics Service Application wizard appears.

  3. Click Next.

  4. On the second page of the wizard, update the location of each Web Analytics database to the new SQL Server instance, and then click Next.

  5. In Central Administration, under System Settings, click Manage Services on Server.

  6. Stop and restart the Web Analytics Data Processing Service, and the Web Analytics Web Service.

    Note:

    The SharePoint Web Analytics feature relies on SQL Server Service Broker to function. SQL Server Service Broker cannot be started manually. A SharePoint timer job runs once per day to ensure that Service Broker is enabled on the necessary databases.

  7. After moving databases, you should manually run the health rule “Web Analytics: Verifies that the SQL Server Service Broker is enabled for the Web Analytics staging databases.” To manually run the health rule, follow these steps:

    • In Central Administration, click Monitoring.
    • In the Health Analyzer section, click Review rule definitions.
      The All Rules page is displayed.
    • Under Category: Configuration, click the health rule Web Analytics: Verifies that the SQL Server Service Broker is enabled for the Web Analytics staging databases.
      The Health Analyzer Rule Definition dialog box opens.

      Note:

      In order to see the health rule, you may need to click the right arrow at the bottom of the All Rules page.

    • On the ribbon of the Health Analyzer Rule Definitions dialog box, click Run Now.

To restart the services in the farm
  1. On the server that is running the Central Administration Web site, at the command prompt, type iisreset /start.

  2. In the Microsoft Management Console Services snap-in, start all of the services related to SharePoint and Internet Information Services (IIS). These include the following services:

    • SharePoint 2010 Administration
    • SharePoint 2010 Timer
    • SharePoint 2010 Tracing
    • SharePoint 2010 User Code Host
    • SharePoint 2010 VSS Writer
    • SharePoint Foundation Search V4
    • World Wide Web Publishing Service
    • SharePoint Server Search 14
    • Web Analytics Data Processing Service
    • Web Analytics Web Service

How to create a Forest Trust in Windows Server 2008 R2

 

Prerequisites

Before a trust can be established, DNS name resolution must be set up between the two forests; this can be accomplished in a few different ways, by using stub zones, conditional forwarders, or secondary zones.  Also, the two forests must have the same, or close to the same, forest functional level.  You can check the forest functional level by going to Administrative Tools -> Active Directory Domains and Trusts.  Then, right-click the forest root and select Raise Forest Functional Level.

Tutorial

1. Go into Active Directory Domains and Trusts inside of Administrative Tools.  Once inside you should see something similar to the next screen.  Right-click the domain you would like to create a trust for and select Properties.  In this tutorial, the domain we will create a trust for is called misdivision.net.

2.  Inside of properties, select the Trusts tab.  You should see something like the next screen.  Select New Trust.

3.  This is the New Trust Wizard.  Select Next.

4.  In this tutorial we are going to create a forest trust.  For a forest trust, the trust name must be a DNS name.  We are going to create a trust with a domain called globodivision.com.  Select Next after specifying the trust name.

5.  Here you select the trust type.  A forest trust, the one we are creating, establishes a transitive trust between the two forests, identified by their forest root domains, covering all users in both forests.  The other option is to create an external trust between just the two domains; external trusts are non-transitive.  Select Forest Trust and then select Next.

6.  Here you specify the direction of the trust.  A two-way trust means users in both domains can be authenticated on the other domain.  One-way means that one domain’s users can be authenticated on the other domain, but not the other way around.  One-way trusts can be established as incoming or outgoing, meaning that they can be set up one way for the domain you are currently configuring or for the other domain.  Select Two-way and select Next.

7.  Here you can set up the trust on this domain only, or on both domains involved in the trust.  Select Both this domain and the specified domain.  You can only do this if you have credentials for the other domain; if you do not, an administrator for the other domain will have to create the other side of the trust.  Select Next.

8.  Input administrative credentials for the other domain to automatically establish the other side of the trust on that domain.  Select Next when finished.

9.  Here you can specify whether local forest users will automatically be authenticated for all resources on the other domain or selectively be authenticated for resources on the other domain.  Forest-wide authentication is generally recommended for users within the same organization.  Select Forest-wide authentication and select Next.  The next screen is similar but it is for the specified forest.  Again, Select Forest-wide authentication and select Next.

10.  You can review the selections you have made here.  Select Next when you have verified they are the selections you wanted.

11.  If your trust was created successfully, you will see this next screen.  There are a few reasons that you may not be able to set up a trust.  DNS between the domains may not be set up properly; make sure that name servers in one domain can resolve servers in the other domain.  Make sure you have the correct administrator credentials for the other domain.  In a lab environment, you may not be able to set up a trust if two virtual machines were deployed from the same server template.

12.  The next few screens of the wizard will ask if you want to confirm both sides of the trust.  Select Yes for both and select Next.

13.  This is the last screen of the wizard.  Select Finish after verifying the changes.

The new trust now appears under Trusts in the properties of misdivision.net.

On the domain controller of the other domain, you can verify that the trust was created by going to Administrative Tools -> Active Directory Domains and Trusts, right-click the domain, and select the Trusts tab under Properties.  The other side of the trust was created automatically because we selected the Both this domain and the specified domain option in Step 7.

Once the trust has been established, you will be able to grant permissions to users to access resources on the other, trusted domain or add users to groups with permissions on the other domain.
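
As a quick check, the trust can also be verified from an elevated command prompt on a domain controller in misdivision.net:

netdom query trust
nltest /domain_trusts

Both commands list the trusts for the current domain, so the new forest trust with globodivision.com should appear in the output.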

Importing PSTs with PowerShell in Exchange 2010 SP1

Importing PST files in Exchange Server 2010 has always been difficult and prone to error. You had to use a 32-bit management workstation, or install the Mailbox Server role on your management workstation, or install Outlook on your Exchange Server; all kinds of issues that can make the SysAdmin’s life difficult. Thankfully, things have changed in Exchange Server 2010 SP1. Time for a closer look…

 

Mailbox Replication Service

Importing PST files has always been difficult because the process relied heavily on some of the Outlook DLLs, but this is no longer the case in Exchange Server 2010 SP1. Microsoft has completely reengineered this, and built PST reading/creation logic directly into Exchange Server 2010 SP1. The process is now integrated into the Mailbox Replication Service (MRS), the same engine that’s also responsible for moving mailboxes between Mailbox Databases. The MRS is a process that runs on the Client Access Server, continuously scanning the Mailbox Databases for move requests. If it finds a move request, it will process it and start moving a mailbox from one Mailbox Database to another. This is an online process, so users hardly notice that their mailbox is being moved. The mailbox remains accessible while it is being moved (and that’s true whether moving from Exchange 2007 to Exchange 2010 or between Exchange 2010 Mailbox Databases), and it continues to accept new messages as they arrive on the Hub Transport Server.

A similar process is used for importing PST files into mailboxes (and exporting PST files from mailboxes, of course); the only difference is that the source of the mail is not a mailbox, but a PST file.

Figure 1: The Mailbox Replication Service is responsible for moving and importing mail data

Import Mailbox

Suppose we have an Exchange administrator (ExAdmin) and, after he implemented Exchange Server 2010 SP1 (including the Personal Archives), he is now responsible for importing the PST files into the Mailboxes and Archives.

However, when the ExAdmin initially opens the Exchange Management Shell and enters the appropriate cmdlet (New-MailboxImportRequest), an error message is generated:

By default, no one has the required permission to perform these actions, and therefore the cmdlets are not available; to enable the mailbox import functionality, the “Mailbox Import Export” role needs to be assigned to the ExAdmin user. An administrator (in this case, the user that first installed Exchange Server 2010, or anyone else who has been given the Organization Administrator privilege) needs to log on and execute the following command:

New-ManagementRoleAssignment -Role "Mailbox Import Export" -User ExAdmin

The ExAdmin user now has the appropriate permission, and can execute the cmdlets after he closes and re-opens the Exchange Management Shell. That’s all it takes…

The cmdlet to import PST files into mailboxes is New-MailboxImportRequest. It takes a number of parameters, but the most important ones are, of course, the mailbox and the path to the file share where the PST file is located. The request does not accept a local directory, so you have to use a file share, but there’s a snag here: when creating the file share, you have to grant the security group “Exchange Trusted Subsystem” read/write permissions on that file share. A nice feature of this cmdlet is the –IsArchive parameter, which is responsible for importing a PST file into the user’s Personal Archive. However, note that to use the Personal Archive, you’ll need an Enterprise Client Access License (eCAL).

So, to import a PST file into Johan’s mailbox, you have to enter the following command:

New-MailboxImportRequest -Mailbox Johan -FilePath \\2010AD02\PST-Files\johan.pst

Note that this command is available only in the Exchange Management Shell in Exchange Server 2010 SP1 – not in the Exchange Management Console.

Performing the actual import of the data into the mailbox is a process that runs on the Client Access Server, as explained earlier when I mentioned the Mailbox Replication Service. So, after entering the import cmdlet, it is possible to close the Exchange Management Shell and log off from the server or management workstation that has the management tools installed.

To monitor the status of the import request, you can use the Get-MailboxImportRequest and the Get-MailboxImportRequestStatistics cmdlets when needed, combined with a selection of parameters, such as those demonstrated in the command below:

Get-MailboxImportRequest | Get-MailboxImportRequestStatistics | `
ft TargetAlias,Percent*,BytesTransferred*

It is also possible to enter multiple requests at the same time, or close together. The requests will be queued, and as the Client Access Server picks them up it processes them sequentially.
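
To see just the requests that are still waiting, a status filter can be used – a small sketch:

Get-MailboxImportRequest -Status Queued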

Now let’s demonstrate how to monitor the progress of an import action. First of all, to import a PST file into the user Johan’s Personal Archive (soon after starting the previous import into his primary mailbox), you can use the following command:

New-MailboxImportRequest -Mailbox Johan -FilePath \\2010AD02\PST-Files\johan-archive.pst `
-IsArchive

Now, when you enter the previous command for monitoring the import status, you’ll see two entries. In this example, the actual primary mailbox is already finished (100%) and the Archive import is still running:

Now that we’ve got the basics, I’ll mention that it’s possible to exclude certain folders during the import using the –ExcludeFolders parameter, and it is also possible to use a –SourceRootFolder parameter to define what data to import from within the source PST; everything outside of the targeted –SourceRootFolder will not be imported. Similarly, the –TargetRootFolder specifies the location in the mailbox where the import will store its data, for example in a \RecoveryFolder location in the mailbox.
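
As a hypothetical example using the same share as before, the following would import only the contents of the PST’s Inbox folder and place it under a RecoveryFolder folder in Johan’s mailbox:

New-MailboxImportRequest -Mailbox Johan -FilePath \\2010AD02\PST-Files\johan.pst `
-SourceRootFolder "Inbox" -TargetRootFolder "RecoveryFolder"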

Interestingly, it is also possible to perform bulk import of PST files. Suppose there are a number of PST files in the file share we’ve used earlier; it is now possible to request a list of those PST files, and pipe this output into the New-MailboxImportRequest:

Dir \\2010ad02\PST-Files\*.pst | %{
New-MailboxImportRequest -Name ImportOfPst -BatchName ImportPstFiles `
-Mailbox $_.BaseName -FilePath $_.FullName
}

The mailbox which is to be targeted is identified from the file name of the PST file; so student-1.pst will be automatically imported into the mailbox named “student-1”. When importing multiple PST files in the same batch, you’ll notice that only one or two imports are active at the same time; other requests have the status “Queued” and are only processed when previous imports are finished.

One of the issues to bear in mind is not to overload the Client Access Server and the Mailbox Server during a PST import, which is why the import process is throttled. The good thing is that this throttling is fully configurable; so you can, for example, really stress your Mailbox Servers and Client Access Servers when you’re performing bulk imports on a Saturday or Sunday. The configuration is stored in a file called MSExchangeMailboxReplication.exe.config, located (by default) in the directory C:\Program Files\Microsoft\Exchange Server\V14\Bin on the Client Access Server; when you open it with Notepad, you’ll find some “MaxActiveMoves” parameters which you can edit. Please note that this file exists on every Client Access Server, which means that you’ll need to adjust it on all CAS machines if you have a large Exchange organization and are doing a bulk import/export.

As you can probably guess, if you increase the number of moves per target MDB, your import requests will process more than just two PST files at a time.
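
For reference, the throttling settings in MSExchangeMailboxReplication.exe.config look something like the fragment below (the values shown are examples only, not recommendations, and the real element carries additional attributes):

<MRSConfiguration
  MaxActiveMovesPerSourceMDB = "5"
  MaxActiveMovesPerTargetMDB = "2"
  MaxActiveMovesPerSourceServer = "50"
  MaxActiveMovesPerTargetServer = "5" />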

Export Mailbox

Besides importing PST files into mailboxes, it is also possible to go the other way, and export mailbox information into PST files. You may want to do this when, for example, you want to delete a mailbox after an employee has left the company. The concept is exactly the same as importing data, except that the cmdlet to use is now New-MailboxExportRequest. To export a particular mailbox to a PST file, you can enter the following command:

New-MailboxExportRequest -Mailbox J.Wesselius -FilePath \\2010AD02\PST-Files\J.Wesselius.pst

And if you want to export only the Personal Archive to a PST file, you can use the –IsArchive parameter again:

New-MailboxExportRequest -Mailbox J.Wesselius -FilePath \\2010AD02\PST-Files\J.Wesselius.pst `
-IsArchive

That’s nice and simple, and things actually become more interesting when you start using filtering. It is possible to filter on specific keywords and on specific dates; so, for example, if I want to export all messages related to “Simple Talk” that were sent and received before January 1st 2010, I can use the following command:

New-MailboxExportRequest -Mailbox J.Wesselius `
-ContentFilter {(body -like "*Simple Talk*") -and (Received -lt "01/01/2010")} `
-FilePath \\2010AD02\PST-Files\JW-SimpleTalk.pst

Monitoring the export process is just like monitoring the import process, and you’ll need to use the following Exchange Management Shell command:

Get-MailboxExportRequest | Get-MailboxExportRequestStatistics | `
ft SourceAlias,Percent*,BytesTransferredPerMinute

 

Conclusion

The import and export mailbox processes in Exchange Server 2010 SP1 have been improved, and – to be honest – it was about time. Importing PSTs is handled by the Mailbox Replication Service, the same engine that’s being used by the online Move Mailbox technology.

The New-MailboxImportRequest and New-MailboxExportRequest cmdlets have quite a few parameters available, and you can filter the commands to customize them to your needs. The only downside is that the process is command-line only, but since Exchange 2007 came with PowerShell 3½ years ago, we are hopefully getting used to that.

Using the Exchange 2010 SP1 Mailbox Export features for Mass Exports to PST files

 

In Exchange 2007 SP1 through to Exchange 2010 RTM, the Export-Mailbox command was the replacement for the once-familiar ExMerge utility when it came to exporting mailboxes to PST files.

The main problem with Export-Mailbox for most Exchange administrators is the requirement for Outlook – either on a 32-bit machine with Management Tools for Exchange 2007, or on a 64-bit machine for Exchange 2010. All in all, it wasn’t ideal and certainly didn’t facilitate scripted mailbox exports.

Thankfully, with Exchange 2010 SP1, Export-Mailbox is going the way of the dodo, and new cmdlets for mailbox imports and exports are available. Just like the New-MoveRequest cmdlet, the new import/export cmdlets use the Mailbox Replication Service to perform the move via one of the Client Access Servers, which brings performance benefits – for example, the PST transfer no longer has to go via the machine with the Exchange Management Tools/Outlook installed, as was the case previously.

The main aim of this post is to give you an overview of how to use the new mailbox export cmdlets, and then show you how to put them to practical use, both at the command line and with a scheduled task for brick-level backups.

Getting it set up

The basic requirements for using the new feature are pretty straightforward. You need to use an account that’s a member of the Organization Management role group, and have the “Mailbox Import Export” role assigned to you or to a role group you’re a member of. As the export is done on a CAS server (and if you have multiple CAS servers, you can’t specify which one in each site will be used), you can’t specify a local drive letter and path – you must specify a UNC path to a network share that the “Exchange Trusted Subsystem” group has read/write access to.

Step One

Create a share on a server, and grant Exchange Trusted Subsystem read/write permission. In this example I’m using a share called Exports on a test server called Azua in my lab environment.


Step Two

Next, you’ll need to grant a user or group the Mailbox Import Export role assignment. You can do this from the Exchange Management Shell with a single command. In this example, I’m granting my lab domain’s Administrator user the role assignment:


New-ManagementRoleAssignment -Role "Mailbox Import Export" -User AD\Administrator

After you’ve done this, close and re-open the Exchange Management Shell, and you’re ready to go!

Exporting a Mailbox

At its simplest, use the New-MailboxExportRequest command with the –Mailbox parameter to specify the mailbox to export, along with the –FilePath parameter to specify the PST file to create and export data to, e.g.:


New-MailboxExportRequest -Mailbox Administrator -FilePath "\\AZUA\Exports\Administrator.pst"

In addition, there are some other useful options – such as –BatchName, which allows grouping of requests together, and –ContentFilter, which allows only certain content to be exported to the PST – useful for discovery purposes.  As usual, use the Get-Help New-MailboxExportRequest –Detailed command to review the full range of options.
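
For instance, the two can be combined – a hypothetical discovery-style export in the same lab environment:

New-MailboxExportRequest -Mailbox Administrator -FilePath "\\AZUA\Exports\Admin-2010.pst" `
-BatchName "Discovery" -ContentFilter {Received -lt "01/01/2011"}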

After submitting your requests, you can check progress, including the percentage complete, with the Get-MailboxExportRequest and Get-MailboxExportRequestStatistics commands. Pipe the former into the latter to get a listing:


Get-MailboxExportRequest | Get-MailboxExportRequestStatistics

After the requests complete, you can remove the requests in a similar fashion, using the Remove-MailboxExportRequest command:


Get-MailboxExportRequest | Remove-MailboxExportRequest

Performing mass exports

One benefit of PowerShell is that it’s very easy to put together commands enabling mass exports of PST data with only a few lines. If you really wanted to, you could even use a PowerShell script as a secondary brick-level backup!

The Basics

So, to see how this is done, let’s look at the simplest case – backing up all the mailboxes (assuming it’s a full Exchange 2010 environment) to a single share:


foreach ($i in (Get-Mailbox)) { New-MailboxExportRequest -Mailbox $i -FilePath "\\AZUA\Exports\$($i.Alias).pst" }

In the above example, we’re simply performing a for-each loop through each mailbox and creating a new Mailbox Export Request, using the alias to build the name for the PST.

But – what if we’re in a mixed environment, and only want to target the Exchange 2010 mailboxes?


foreach ($i in (Get-Mailbox | Where {$_.ExchangeVersion.ExchangeBuild.Major -eq 14})) { New-MailboxExportRequest -Mailbox $i -FilePath "\\AZUA\Exports\$($i.Alias).pst" }

In the example above, we’ve added a clause to select only the mailboxes where the Exchange major build is 14 – Exchange 2010. Simple!

Moving on from such wide targeting, you may want to target just a predefined list, using a CSV file. To do this, simply create a CSV file with the column “Alias”, and list the mailbox Alias values you wish to export. Then, using the Import-Csv command, we can use this CSV file to create the requests:


foreach ($i in (Import-Csv .\exports.csv)) { New-MailboxExportRequest -Mailbox $i.Alias -FilePath "\\AZUA\Exports\$($i.Alias).pst" }
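
For completeness, an exports.csv for the command above would look something like this (the aliases are, of course, hypothetical):

Alias
student-1
student-2
j.wesselius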