SharePoint Reporting Methods

Reporting in SharePoint can be very helpful in all kinds of scenarios: storage growth projection, weekly service reviews, operational tracking for backups / restores, and more. Below are a few scripts that I have written for MOSS 2007 / WSS 3.0. They can be used to pull all kinds of good information that might not be available from the front end (object model).

Please keep in mind that direct database queries or modifications are not supported in any way. A suggestion would be to run these against a test or backed-up copy of your production data.

Content Script 1 

The following script grabs all kinds of great data from your MOSS 2007 / WSS 3.0 content database, such as:

· SiteURL – URL path for the site collection
· SiteAdmin – Site owner / admin for the site collection
· RecycleBin – Amount of disk space the Recycle Bin is currently using
· BandwidthUsed – Amount of traffic the site has generated
· SiteSize – Amount of space the site takes up in the database
· SiteMaxQuota – Maximum quota specified for the site
· SiteID – Site GUID assigned to the site collection
· Content_DB – Database the site resides in
· ServerName – Server the database resides on
· LastContentChange – Last time a user modified any part of the site
· DaysSinceLastChange – Days since the last modification to the site

USE <ContentDatabase>

SELECT DISTINCT
       a.fullurl AS [SiteUrl],
       b.tp_login AS [SiteAdmin],
       SUM(CAST(c.size AS decimal)) / 1024 / 1024 AS [RecycleBin],
       CAST(d.bwused AS decimal) / 1024 / 1024 AS [BandwidthUsed],
       CAST(d.diskused AS decimal) / 1024 / 1024 AS [SiteSize],
       CAST(d.diskquota AS decimal) / 1024 / 1024 AS [SiteMaxQuota],
       d.id AS [SiteID],
       (SELECT db_name(dbid) FROM master..sysprocesses WHERE spid = @@SPID) AS [Content_DB],
       (SELECT @@SERVERNAME) AS [ServerName],
       d.lastcontentchange AS [LastContentChange],
       (SELECT DATEDIFF(day, d.lastcontentchange, CURRENT_TIMESTAMP)) AS [DaysSinceLastChange]
FROM webs AS a
     INNER JOIN sites AS d ON a.siteid = d.id
     INNER JOIN userinfo AS b ON a.siteid = b.tp_siteid
     LEFT JOIN recyclebin AS c ON a.siteid = c.siteid
WHERE b.tp_siteadmin = '1'
  AND a.parentwebid IS NULL
GROUP BY a.fullurl, b.tp_login, d.diskused, d.id, d.bwused, d.diskquota, d.lastcontentchange
ORDER BY a.fullurl

 

Configuration Script 1

The following script creates a high-level snapshot of how your farm looks from a SQL standpoint. It is run against your configuration database and returns the following columns:

· ConfigServer – Server your configuration database lives on
· Config_DB – Name of your configuration database
· Content_DB – Name of the content database a given site collection lives in
· SiteID – Site GUID assigned to the site collection
· ServerName – Server where the content database lives
· SiteURL – Full URL for each site collection in the farm

USE <Configuration Database>

SELECT (SELECT @@SERVERNAME) AS [ConfigServer],
       (SELECT db_name(dbid) FROM master..sysprocesses WHERE spid = @@SPID) AS [Config_DB],
       a.name AS [Content_DB],
       c.id AS [SiteID],
       b.name AS [ServerName],
       'https://' + d.name + c.path AS [SiteURL]
FROM objects AS a
     INNER JOIN sitemap AS c ON a.id = c.databaseid
     INNER JOIN objects AS d ON c.applicationid = d.id
     INNER JOIN objects AS f ON a.parentid = f.id
     INNER JOIN objects AS b ON f.parentid = b.id
WHERE a.id IN (SELECT databaseid FROM sitecounts)
ORDER BY a.name

Content Script 2 

The following script grabs lower-level web information for your site collections, including the following columns:

· ServerName – Server the content database is located on
· Content_DB – Content database the web lives in
· SiteID – Site GUID assigned to the site collection
· WebID – Web GUID assigned to the specific web
· WebURL – URL for the web
· ParentWebID – GUID of the parent web that this web is a child of (NULL = root of the site collection)
· WebTemplate – Template the web was provisioned from
· Language – Language the web was provisioned in
· ProductVersion – Product version of the site
· DocumentCount – Number of documents that live within the web

SELECT DISTINCT
       (SELECT @@SERVERNAME) AS [ServerName],
       (SELECT db_name(dbid) FROM master..sysprocesses WHERE spid = @@SPID) AS [Content_DB],
       a.siteid AS [SiteID],
       a.id AS [WebID],
       a.fullurl AS [WebURL],
       a.parentwebid AS [ParentWebID],
       a.webtemplate AS [WebTemplate],
       a.language AS [Language],
       a.productversion AS [ProductVersion],
       COUNT(b.id) AS [DocumentCount]
FROM webs AS a
     INNER JOIN alldocs AS b ON a.siteid = b.siteid AND a.id = b.webid
GROUP BY a.siteid, a.id, a.fullurl, a.parentwebid, a.webtemplate, a.language, a.productversion
ORDER BY a.fullurl

As you can see, all three queries return the site GUID, a Content_DB, and a ServerName. With that in mind, if you pull this data into a central repository you can write SQL joins on those three columns and create all kinds of helpful views of your infrastructure. Additionally, you can wrap SQL cursors around these scripts to pick up the table schema and loop through all of your content databases rather than manually executing against just one. The sky really is the limit. I will have an update to this with some example cursors and joins in the coming weeks; in the meantime, the sketches below show the general idea.
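To illustrate the join idea, here is a minimal sketch. It assumes the three result sets above have already been loaded into a reporting database as tables named dbo.SiteRollup (Content Script 1), dbo.FarmMap (Configuration Script 1) and dbo.WebDetail (Content Script 2); those table names are placeholders I made up for this example, not anything SharePoint creates.

-- Minimal join sketch; dbo.FarmMap, dbo.SiteRollup and dbo.WebDetail are placeholder
-- tables holding the saved output of the three scripts above.
SELECT m.SiteURL,
       r.SiteAdmin,
       r.SiteSize,
       r.DaysSinceLastChange,
       w.WebURL,
       w.DocumentCount
FROM dbo.FarmMap AS m
     INNER JOIN dbo.SiteRollup AS r
         ON  m.SiteID = r.SiteID
         AND m.Content_DB = r.Content_DB
         AND m.ServerName = r.ServerName
     INNER JOIN dbo.WebDetail AS w
         ON  m.SiteID = w.SiteID
         AND m.Content_DB = w.Content_DB
         AND m.ServerName = w.ServerName
ORDER BY m.SiteURL, w.WebURL

And here is one way a cursor could loop through a list of content databases and run a per-database query into a central table. The #ContentDBs temp table (with example entries), the dbo.WebInventory repository table and the trimmed-down inner query are all placeholders of my own; in practice the database names could come straight from the output of Configuration Script 1, and the inner query could be the full Content Script 2.

-- Minimal cursor sketch; #ContentDBs and dbo.WebInventory are placeholders.
DECLARE @dbname sysname, @sql nvarchar(max)

CREATE TABLE #ContentDBs (name sysname)
INSERT INTO #ContentDBs (name) VALUES ('WSS_Content_01')   -- example entries only
INSERT INTO #ContentDBs (name) VALUES ('WSS_Content_02')

DECLARE db_cursor CURSOR FAST_FORWARD FOR
    SELECT name FROM #ContentDBs

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @dbname

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the per-database statement; swap in the full Content Script 2 here.
    SET @sql = N'SELECT @@SERVERNAME AS [ServerName], '
             + N'N''' + @dbname + N''' AS [Content_DB], '
             + N'a.siteid, a.id AS [WebID], a.fullurl AS [WebURL] '
             + N'FROM [' + @dbname + N'].dbo.webs AS a'

    -- Capture each database's rows into the central repository table.
    INSERT INTO dbo.WebInventory (ServerName, Content_DB, SiteID, WebID, WebURL)
    EXEC sp_executesql @sql

    FETCH NEXT FROM db_cursor INTO @dbname
END

CLOSE db_cursor
DEALLOCATE db_cursor
DROP TABLE #ContentDBs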

Cory

Comments

  • Anonymous
    January 01, 2003
    Just to say we have a solution for that purpose... CardioLog, it's a SharePoint reporting solution. Have a look. Uri, Intlock.

  • Anonymous
    January 01, 2003
    I can recommend MAPILab Statistics for SharePoint for tracking usage statistics of your SharePoint implementation. http://www.mapilab.com/sharepoint/statistics/ Integration with Active Directory, deeply detailed reports, reports on search, and support for any topology.

  • Anonymous
    January 21, 2012
    Hi All, SharePoint already comes with usage analysis and health reports. If you want to report on Business content, you need to use something like the Share Point Data Miner. www.ipmo.com.au/.../Overview.aspx Adrian