6th Annual Higher Education Data Warehousing Conference
Session Descriptions (Subject to Change)

Indiana Memorial Union, Indiana Univ., April 26-28, 2009 (Last updated 04/20/2009)

Monday, April 27

7:00 to 8:00 am - Alumni Hall - Continental Breakfast

8:00 to 9:00 am - Whittenberger - HEDW Keynote Address - John Milam, HigherEd.org

9:15 to 10:15 am - Break-out Session #1

Whittenberger
Designing Dashboards to Die For, John Rome, Arizona State University

As a follow-up to his presentation "Dashboards that Make a Difference" given at HEDW in Austin in 2007, John will share some of the lessons that Arizona State University has learned over the past two years about presenting and visualizing information through dashboards and other tools. Drawing on the work of Wayne Eckerson, Edward Tufte, and Stephen Few, he will show how their teachings and techniques can be applied in a higher education setting. See http://dashboard.asu.edu.

Frangipani
Using a Factless Fact and Bridge Tables to Model Effective-Dated HR Data, Sherri Woolard, Washington University in St. Louis

This presentation will examine how Washington University in St. Louis tackled the challenge of modeling effective-dated HR data from their PeopleSoft ERP system to achieve accurate point-in-time and period-of-time reporting in their data warehouse. This technical discussion will focus on the following topics (a brief illustrative sketch follows the list):

  • The concept behind the 'factless fact' star schema design.
  • Using a factless fact table to store unique effective dates from every HR dimension.
  • Distinguishing between 'person' and 'job' dimensions.
  • Using bridge tables to connect the appropriate dimension rows to the factless fact table.
  • Joining multiple appointments (jobs) to the factless fact table while avoiding double-counting.
  • Eliminating the max effective date and max effective sequence subqueries commonly used in HR transaction systems.
  • ETL steps.
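
As context for the topics above, here is a minimal, runnable sketch of the general pattern: a factless fact of effective dates, a bridge table, and a point-in-time lookup. The table and column names are hypothetical and are not Washington University's actual PeopleSoft schema.

```python
# Illustrative sketch only -- hypothetical tables, not WashU's actual design.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Factless fact: one row per person per effective date observed in any HR dimension.
cur.executescript("""
CREATE TABLE hr_effdt_fact (person_id TEXT, eff_date TEXT);
CREATE TABLE job_dim (job_key INTEGER PRIMARY KEY, person_id TEXT,
                      job_title TEXT, eff_start TEXT, eff_end TEXT);
-- Bridge: connects each factless-fact row to the job dimension rows
-- in effect on that date (one row per concurrent appointment).
CREATE TABLE job_bridge (person_id TEXT, eff_date TEXT, job_key INTEGER);
""")

cur.executemany("INSERT INTO hr_effdt_fact VALUES (?, ?)",
                [("p1", "2008-01-01"), ("p1", "2008-07-01")])
cur.executemany("INSERT INTO job_dim VALUES (?, ?, ?, ?, ?)",
                [(1, "p1", "Analyst", "2008-01-01", "2008-06-30"),
                 (2, "p1", "Senior Analyst", "2008-07-01", "9999-12-31")])
cur.executemany("INSERT INTO job_bridge VALUES (?, ?, ?)",
                [("p1", "2008-01-01", 1), ("p1", "2008-07-01", 2)])

# Point-in-time query: take the latest effective date on or before the as-of
# date from the factless fact, then follow the bridge to the dimension rows
# in effect on that date.
as_of = "2008-08-15"
cur.execute("""
SELECT f.person_id, f.eff_date, j.job_title
FROM hr_effdt_fact f
JOIN job_bridge b ON b.person_id = f.person_id AND b.eff_date = f.eff_date
JOIN job_dim    j ON j.job_key   = b.job_key
WHERE f.eff_date = (SELECT MAX(eff_date) FROM hr_effdt_fact
                    WHERE person_id = f.person_id AND eff_date <= ?)
""", (as_of,))
print(cur.fetchall())   # [('p1', '2008-07-01', 'Senior Analyst')]
```

In a real warehouse the as-of date would typically come from a date dimension rather than a literal parameter.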

Georgian
Keeping Up or Keeping Ahead, Michael Wonderlich, University of Illinois

Now that you have your data warehouse and business intelligence environment built, where do you focus your attention next? Even before you finish building your environment, requests for changes are already coming in. There will always be changes requested: something was built incorrectly, a business process is changing, or users want a new piece of information made available. Keeping Up includes regular maintenance of your data warehouse and BI environment, but also the operational support needed to maintain ETL, keep systems available, and answer questions. At the same time, you want to capitalize on new technology and lead your institution to better decision making. Keeping Ahead enables you to guide your institution into new ways of making decisions and doing business. In this presentation we will examine the importance of being operationally stable and performing routine maintenance, but also the importance of being thought leaders in the areas of business intelligence and decision making. The goal is to find a balance that achieves both… Keeping Up and Keeping Ahead.

10:30 to 11:30 am - Break-out Session #2

Whittenberger
Implementing a Data Warehouse at Drexel University, Insiyah Jamal, Drexel University

Attend this session to learn what was involved in implementing a data warehouse at Drexel University. This session will cover:

  • requirements and validation phases
  • challenges and lessons learned
  • benefits realized from the new tool now available at senior management's fingertips
  • an overview of the data warehouse
  • sample reports and analytics

This session should be useful to anyone planning a new implementation.

Frangipani
Using a Data Warehouse to Audit a Transactional System, Michael Glasser, UMBC

UMBC has developed a data warehouse process, using SQL, that audits and reports the changes made to certain transactional tables. The process identifies the records that were added or deleted yesterday, and any fields that were modified, showing both the old and new values. The HR office currently uses this process to audit the data entry staff's changes to employee job records, bio/demo data, citizenship, and federal/state tax records. This presentation will demonstrate the table setup required to audit particular tables. It will include the special steps necessary to handle effective dating, along with the generic SQL procedure used to audit any table structure, in both Oracle and SQL Server. It will show the SQL procedure used to email a summary of changes to the HR data manager every night, and the Crystal Report used for the audit details.
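
As a rough illustration of the underlying idea (not UMBC's actual SQL procedures), the sketch below compares yesterday's and today's snapshots of a hypothetical table and reports additions, deletions, and field-level changes with old and new values.

```python
# Generic snapshot-comparison sketch of a nightly audit (hypothetical data;
# UMBC's production version is implemented as SQL procedures).
from datetime import date

def audit(yesterday: dict, today: dict) -> dict:
    """Compare two snapshots keyed by primary key; each value is a dict of fields."""
    added   = [k for k in today if k not in yesterday]
    deleted = [k for k in yesterday if k not in today]
    changed = []
    for key in today.keys() & yesterday.keys():
        for field, new in today[key].items():
            old = yesterday[key].get(field)
            if new != old:
                changed.append((key, field, old, new))  # old and new values side by side
    return {"added": added, "deleted": deleted, "changed": changed}

yesterday = {101: {"dept": "HR",  "tax_status": "M"}}
today     = {101: {"dept": "FIN", "tax_status": "M"}, 102: {"dept": "HR", "tax_status": "S"}}

summary = audit(yesterday, today)
print(date.today(), summary)
# {'added': [102], 'deleted': [], 'changed': [(101, 'dept', 'HR', 'FIN')]}
```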

Georgian
What I Get/Wish I Could Get from the Warehouse: The View from Institutional Research, Emily Thomas (convener), Stony Brook Univ; Linda Lorenz, Univ of Minnesota Twin Cities; Ed Schaefer, DePaul Univ; Linda Sullivan, Univ of Central Florida; Daryl Wright, Northern Kentucky Univ

This panel will discuss IR reporting products that could be, or could better be, delivered through a business intelligence data source and BI reporting tools; the issues involved in changing the way the information is delivered; and related topics on how a BI data source can best support IR. Academic department profiles are one example, on the convener's campus, of an IR product that could be subsumed by the BI data source, but not easily.

11:30 am to 12:45 pm - Alumni Hall - Lunch

1:00 to 2:00 pm - Break-out Session #3

Whittenberger
Creating and Maintaining Cross Functional Data Marts, Ted Bross, Princeton University; (Panelists - TBA)

Many organizations have built their data warehouses using a bottom up approach. In reality, they have created multiple data marts that represent data contained within individual operational systems such as Admissions, Financial Aid, Registration, Housing and University Finances. These marts provide users of their source systems with an easy way to create both standard and ad hoc reporting environments. However, despite the relative success of such an approach, there is an ever-increasing need to combine data from the disparate data marts into a more consolidated view of the student or of the organization. As such, some organizations have begun to create cross-functional data marts that combine the data, and in the process, have encountered a new set of issues and challenges. Some of these challenges include security, timing and data integrity. Four institutions that have attempted to build these cross-functional data marts will provide their experiences, both good and bad, in this panel discussion.

Frangipani
Packaging data to ease operational pain: a case study for payroll reconciliation, Mark Pollard, University of Illinois

At the University of Illinois – as at other institutions – reconciling payroll is a critical process and can be a real nightmare for larger departments. Many of our departments knew they had payroll issues but didn't know how to address them. Making matters worse, payroll was one of our first Warehouse deployments, released five years ago and designed more for flexibility than for ease of use. While a few of the larger units with their own technical resources had created innovative solutions to help them identify payroll issues, most units lacked both the resources and the expertise. Even those who had been able to create the complex reports needed using Business Objects had problems, because expanding data volumes resulted in reports that ran forever, if they finished at all. To address these challenges, we built a new data mart that organizes payroll information to make reconciliation much easier, including derived fields that help pinpoint problems. Beyond just a new reporting tool, this product has also allowed us to highlight some best practices for how departments can improve their business process for reconciliation. This presentation will walk through a case study describing the entire project from end to end.
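
For illustration only, the sketch below shows the kind of derived field such a reconciliation mart might add; the data and field names are hypothetical and are not the University of Illinois' actual design.

```python
# Hypothetical reconciliation sketch: compare expected pay (from the appointment)
# to posted pay and derive a flag that pinpoints rows needing review.
expected = {("emp1", "2009-03"): 4000.00, ("emp2", "2009-03"): 2500.00}
posted   = {("emp1", "2009-03"): 4000.00, ("emp2", "2009-03"): 1250.00}

for key in sorted(expected.keys() | posted.keys()):
    variance = posted.get(key, 0.0) - expected.get(key, 0.0)
    status = "OK" if abs(variance) < 0.01 else "REVIEW"   # derived flag to pinpoint problems
    print(key, f"variance={variance:+.2f}", status)
```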

Georgian
Challenges of an Expansive Business Intelligence Project, Helen Ernst, State University of New York; Dan Brint, SUNY ITEC

The State University of New York has embarked on a system-wide BI initiative. With 64 campuses across New York, the project faces the challenges of managing a large-scale, complex effort. We are proud to reveal our successes and share the more challenging tasks we have faced. The SUNY Business Intelligence Initiative (SBII) responds to a vision of centralized (campus-based and University-wide) systems that provide improved decision support throughout the University. The fundamental goals of the SBII are to (1) integrate administrative data into an accurate and reliable information resource and (2) support planning, forecasting, and decision-making processes at campuses and System Administration. To achieve these goals, the SBII was formed out of the SUNY Alliance for Strategic Technology (AST). The SBII's primary focus is the implementation of decision support by working with functional area groups composed of members of the University community. Decision Support is the core of the SBII, responsible for designing, implementing, and delivering solutions to address business problems. For phased releases, SUNY has found success using small proof-of-concept projects targeting specific reporting needs. A demonstration will show areas where we have produced and successfully released dashboards:

  • Institutional Research
  • Community College Financials
  • Library

2:15 to 3:15 pm - Break-out Session #4

Whittenberger
From Data Model to Loaded Tables: Approach and Learnings, Joseph Kerr, University at Buffalo

The presentation will describe the framework and approach utilized by UB for developing data models, blueprints, and ETL, and for managing the target star schema tables. UB made extensive use of Excel and SharePoint to document and manage the process, track and assign issues, and communicate progress. The ETL architecture includes four stages of tables that served well to encapsulate work, increase manageability, and protect the production data while reducing downtime for loads. The objectives of the presentation are to share an approach to data modeling and transformation, discuss lessons learned, and provide ideas to consider when undertaking similar work.
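
The abstract does not spell out UB's four stages, so the sketch below is only a hypothetical illustration of how staged tables can encapsulate work and keep the production swap brief.

```python
# Hypothetical four-stage table pipeline; the stage names and rules are
# illustrative only, not UB's actual architecture.
raw = [{"id": "1", "amount": " 100 "}, {"id": "2", "amount": "not-a-number"}]

stage1_landing   = list(raw)                                   # 1) land the source extract untouched
stage2_cleansed  = [r for r in stage1_landing
                    if r["amount"].strip().isdigit()]          # 2) apply data-quality rules
stage3_transform = [{"id": int(r["id"]),
                     "amount": int(r["amount"])}               # 3) conform types / business rules
                    for r in stage2_cleansed]
production_star  = stage3_transform                            # 4) swap into the star schema last,
                                                               #    keeping production downtime short
print(production_star)   # [{'id': 1, 'amount': 100}]
```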

Frangipani
DAC! SMAT! GO! Or "A Solution to Implementing Data Security Standards", Anja Canfield-Budde, University of Washington

In 2008 we presented the University of Washington's first plans for implementing fine-grained data access controls in a SQL Server enterprise data warehouse environment. This year, I will introduce the technologies and tools we have developed and now use in production to administer row- and column-level security on SQL Server objects. The DAC (Data Access Control) applies different levels of access depending on the role of the user. The SMAT (Security Metadata Administration Tool) is an intuitive web front-end for configuring this data access. In this talk I will also cover the federal, state, and university data security standards our products are based on, share what we have learned by building and deploying these technologies, and – most importantly – show how other institutions might benefit from our work by adopting the practical solutions we provide. The Boooom? Come see for yourselves!
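
As a generic illustration of role-driven row- and column-level filtering (not the DAC/SMAT implementation itself, and with hypothetical roles and columns):

```python
# Generic illustration of security metadata driving row- and column-level access;
# roles, columns, and filters here are hypothetical, not UW's production rules.
SECURITY_METADATA = {
    "payroll_analyst": {"columns": {"emp_id", "dept", "salary"}, "row_filter": lambda r: True},
    "dept_viewer":     {"columns": {"emp_id", "dept"},           "row_filter": lambda r: r["dept"] == "BIO"},
}

ROWS = [
    {"emp_id": 1, "dept": "BIO",  "salary": 60000},
    {"emp_id": 2, "dept": "CHEM", "salary": 65000},
]

def query_as(role: str):
    meta = SECURITY_METADATA[role]
    # Apply row-level security first, then project only the allowed columns.
    return [{c: r[c] for c in meta["columns"]} for r in ROWS if meta["row_filter"](r)]

print(query_as("dept_viewer"))   # e.g. [{'emp_id': 1, 'dept': 'BIO'}] (column order may vary)
```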

Georgian
Connecting Data, Richard Howard & Linda Lorenz, University of Minnesota; Gerald McLaughlin, DePaul University

Increasingly, colleges and universities are turning to external data sources to better understand their institutional effectiveness and relative standing among peers or nationally. These external data sources have been developed to meet the specific needs of their owners, both in terms of content and structure. Designing interfaces is a challenge that requires mapping nationally accepted coding structures like CIP codes and IPEDS UnitIDs to your institutional data structure. On our campuses, data are used to support three kinds of activities: operations, research, and reporting. Creating comparable information, when integrating institutional data with external data, requires an understanding of census dates and their use in identifying an "official" data set or flag. During the past year, we have integrated data from the University of Minnesota enterprise system, the AAUDE Warehouse, IPEDS, and the National Student Clearinghouse. In this presentation we will discuss what was needed to accomplish this data "connection".

3:15 to 3:45 pm - Afternoon Break (in Break-out Session rooms)

4:00 to 5:00 pm - Break-out Session #5

Whittenberger
MDM, Metadata, and BI: What do we mean and how do they relate?, Tim Wilson, University of Notre Dame

Two of the top five Google search results for "master data metadata" are "Master Data is not Metadata," from MSDN, and "Master Data is Metadata," from Information Management Magazine. So what *is* the relationship between master data management and metadata management? How are institutions of higher learning managing these assets? What value is derived, operationally and strategically, through their BI efforts?

Frangipani
Building for Analytics: Critical Success Factors, Joanne Wilhelm, Indiana University; John Rome, Arizona State University; Jim Singleton, Cornell University; Jeff Stark, Rensselaer Polytechnic Institute

Institutions of higher education are facing increased pressure to build analytic capability to drive strategic initiatives and provide accountability. Whether you are just beginning to think about a dashboard and your initial dimensional model or you have delivered KPIs from an enterprise data warehouse, the same question is key. What are the common factors associated with successful analytics in higher education? This session will briefly review the research on success factors for delivering analytics and effective business intelligence. Panel participants will review the progress toward analytics at their institutions, identify the unique challenges and opportunities in their environments, and identify success factors. Published research findings will serve as a framework for the panel discussion. The goal is a lively discussion of diverse experiences in building for analytics and improved performance outcomes at our individual institutions.

Georgian
Can a Wiki Be Used to Define Data in a Data Warehouse?, Daniel Riehs, Boston College

Those who attend this presentation will learn about an ongoing experiment at Boston College to collaboratively define data warehouse data with a wiki. Since BC's data warehouse started out primarily as a reporting mechanism for the users of operational systems, very little effort was put into documenting data definitions. As the data is increasingly used not for operational reporting but for analysis across University subject areas, clearly documented data definitions are quickly becoming a necessity. The wiki is not currently being used to develop data standards. It is simply a place where warehouse users can document the current state of the data. Most of the information in BC's warehouse originates in a 30-year-old legacy student system that is full of idiosyncrasies and inconsistencies. Since no single person understands all of the data in the warehouse, a wiki was seen as an inexpensive way to collaboratively document the knowledge of many different data warehouse users. The presentation will describe the difficulties of setting up the wiki and persuading people to use it, as well as the role it is playing in a recently begun data warehouse redesign and improvement project.

5:00 to 6:00 pm - Whittenberger - HEDW Business Meeting

6:00 to 10:00 pm - Area Restaurants - Walking Wounded (Bird-of-a-Feather Dinners)

Tuesday, April 28

7:00 to 8:00 am - Alumni Hall - Continental Breakfast

8:00 to 9:00 am - Break-out Session #6

Whittenberger
Three Elements for Successful Reporting at Virginia Tech, Alan Moeller, Virginia Tech

This presentation provides an overview of Data Warehousing at Virginia Tech. It defines the purpose and goals of the data warehouse, describes the composition of the data warehouse development team, and reviews some of the early actions and decisions that helped the project get off to a good start. It gives insight to the ETL process and some of the quality checks that ensure data accuracy. It concludes with an overview of our training program for users of the data warehouse and will include a demonstration of some of the dashboards that have been developed to support University decision processes.

Frangipani
Dimensional Modeling Workshop - Art or Science (Part 1), Ora Fish & Jeff Stark, Rensselaer Polytechnic Institute

This workshop aims to demystify dimensional modeling and to demonstrate that modeling is more of an art than a science. Participants will be given a quick overview of the different dimensional modeling techniques and their applications. The audience will then break into working groups to design their own conceptual data models based on predefined requirements. A collective review and discussion of the groups' designs will follow, along with a review of Rensselaer's design and a demonstration of how it is being utilized.

Georgian
Reviewing a Successful Data Warehouse Change Management Program, Jennifer Selk & Michael Wonderlich, University of Illinois

Change Management is an integral part of an Enterprise Data Warehouse (EDW). The University of Illinois Decision Support group follows a comprehensive process to determine the feasibility and appropriateness of changes to the production data warehouse and Business Intelligence (BI) environment. This presentation will provide an overview and detailed review of our Change Management program by examining our entire workflow process, starting with the original request and tracking it through final implementation. We will focus on areas such as the types of changes, how they are scheduled, who is involved, the frequency of changes, and a review of the typical steps of the workflow.

9:15 to 10:15 am - Break-out Session #7

Whittenberger
Building BI Blueprints with Visio, Push-Pins, and String, Aaron Walz & Beth Ladd, University of Illinois

Academic institutions are extremely diverse organizations with many different types of information consumers who have a wide variety of needs. However, understaffed BI teams typically have limited time and resources. Under these constraints, how can you create a BI strategy and deliberately build toward a comprehensive solution while still delivering short term value? We will demonstrate a method that begins with systematically analyzing your business processes and related information needs for each customer type. Next, blueprints are created to define a set of front-end BI solutions and what underlying data marts are needed to support them. A key step in this process is determining the right flavor of BI for each need, avoiding one-size-fits-all solutions. It also helps identify reusable components to minimize rework. Finally, these blueprints make it possible to define manageable chunks of work that can be built separately but that will ultimately fit together into an integrated solution.

Frangipani
Dimensional Modeling Workshop - Art or Science (Part 2), Ora Fish & Jeff Stark, Rensselaer Polytechnic Institute

(Continuation of the 8:00am break-out session. See earlier session description for details.)

Georgian
Data Warehousing and Changed Data Capture (CDC), Madan Dorairaj & Brian Long, Princeton University

This session will highlight the thought process that led us to use CDC as part of our overall ETL strategy. We will begin by discussing the business needs, driven by large quantities of data that made full nightly refreshes impractical. This will be followed by the technical aspects that drove our decision. We will cover lessons learned, the alternative methods that were investigated, and the final CDC implementation.
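
As a generic illustration of the incremental idea behind CDC (not Princeton's actual tooling or table structures), the sketch below applies a small batch of captured inserts, updates, and deletes to a warehouse table instead of performing a full refresh.

```python
# Generic sketch of applying captured changes rather than a full nightly refresh;
# the change-record format and data are hypothetical.
warehouse = {1: {"name": "Alice", "status": "active"},
             2: {"name": "Bob",   "status": "active"}}

# Change records as a CDC feed might deliver them since the last load.
changes = [
    {"op": "U", "key": 2, "row": {"name": "Bob", "status": "inactive"}},
    {"op": "I", "key": 3, "row": {"name": "Cara", "status": "active"}},
    {"op": "D", "key": 1, "row": None},
]

for change in changes:
    if change["op"] in ("I", "U"):
        warehouse[change["key"]] = change["row"]     # insert or update the changed row
    elif change["op"] == "D":
        warehouse.pop(change["key"], None)           # remove deleted rows

print(warehouse)
# {2: {'name': 'Bob', 'status': 'inactive'}, 3: {'name': 'Cara', 'status': 'active'}}
```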

10:30 to 11:30 am - Break-out Session #8

Whittenberger
Getting It Right: Successfully Deploying an Enterprise Reporting Environment, Kathy Luker, University of Wisconsin

The old adage, “If you build it, they will come,” may apply to cornfield ballparks, but rarely does it hold true for attracting users to reporting systems! The University of Wisconsin has learned three key ingredients to engaging users: making access really easy and job focused; making training relevant to common business practices; and employing direct communication with users. UW-Madison’s web-based Query Library enables campus staff to access information on their own schedules, without having to write queries, so that access to information to improve organizational effectiveness is easy, affordable, and widely available. Using Hyperion (https://authhub.wisconsin.edu/?app=Hyperion) as the door to the University’s data warehouse, the Query Library contains end-user written and tested queries that broadly meet campus information needs around important business processes. In this presentation we’ll demonstrate how an easy, automated authorization process, scenario-based training, direct communication, and user feedback are essential to making decision support a widely-used organizational learning tool that strengthens employee skill sets, improves processes, reduces cycle time and increases staff efficiency.

Frangipani
Dimensional Modeling Workshop - Art or Science (Part 3), Ora Fish & Jeff Stark, Rensselaer Polytechnic Institute

(Continuation of the 8:00am break-out session. See earlier session description for details.)

Georgian
Vendor Relations - Achieving the Win-Win, Scott Wiggans, University of Denver

A humorous and informative "other side of the table" perspective from a RECOVERING consultant and software executive.

11:30 am to 12:45 pm - Alumni Hall - Lunch

1:00 to 2:00 pm - Break-out Session #9

Whittenberger
Technical "lightning talks", Mike Deutsch, McGill University; Helen Ernst, State University of New York; Madan Dorairaj, Princeton University; Ian Wall, Harvard University; Vicky Shaffer, Virginia Tech

Lightning Talks are a show-and-tell format currently popular in the technology/web community, in which a handful of short presentations (~5-10 minutes, including Q&A) are compressed into a single regular time slot. The “sharing” format makes them a natural fit for HEDW and its members. Compared with standard conference talks, they feel less like a lecture and more like a forum. They are an easy first step for new speakers and a good way to mix bite-size or late-breaking issues into a conference where most sessions have a very long-term perspective. Lightning Talks are also an effective catalyst for peer conversations; by giving a greater number of people just a little bit of exposure, they increase the likelihood that like-minded members will find each other.

Frangipani
Biting the Serpent's Tail: How to Turn Assessment into Improvement, Frank McCluskey, Dave Becher, Melanie Winter, Liz Wallace, American Public University System

Custom built by a third-party vendor, our data warehouse is updated daily and enhanced as needed to provide analytical information from our unique ERP system. The data warehouse is used as a tool to ensure that information drives continuous improvement on a regular basis. Findings from the data warehouse continuously improve our policies, procedures, systems, and services at the University. Built with cooperation from all departments of the institution, the data warehouse is regularly used for our triennial program review process, the tracking of quantitative goals for the management team, and the regular dissemination of data across all constituents of the institution. On a regular basis, data warehouse findings are incorporated into ongoing conversations, decision-making processes, and strategic/budget planning. The sharing of this information promotes the continuous improvement of the University and assists in ensuring quality at all levels of the institution.

Georgian
A First Look at Data Warehousing Kuali Financial System Data, Dylan Cooper, The University of Arizona

The University of Arizona is implementing the Kuali Financial System (KFS), a community-source financials system designed specifically for higher education by member universities of the Kuali Foundation. KFS comes with very little built-in reporting and no accompanying data warehouse. Additionally, it is in full use at only one institution, Strathmore University in Kenya. I will discuss the insights gained from our first few months of designing and implementing a data warehouse of KFS data. (This warehouse is integrated into a PeopleSoft EPM instance containing employee and student data.) In particular, I will discuss aspects of the KFS data model that are particularly relevant to the warehouse data model and ETL jobs, the design decisions we have made, the models we have developed so far, and any particularly useful implementation details.

2:15 to 3:15 pm - Break-out Session #10

Whittenberger
Business Intelligence for $7 A Day, Doug Price, Miami University

Getting started with Business Intelligence during a financial crisis can be very difficult. However, that may be the very time BI is needed the most! Miami University has jumpstarted a Business Intelligence program from the ground up by creating an Enrollment star schema to use as a Proof of Capability. A Proof of Capability is different from a proof of concept: BI is a proven concept, but a Proof of Capability shows how BI will work with an institution's own data, answering institutional questions, on institutional machines. This can be done without investing large amounts of money and in a relatively short time. Higher education's informational needs are very different from the "bike shop" demonstrations traditionally shown by BI vendors, which do not allow stakeholders to visualize the benefits of BI. This, along with the high costs of BI suites and ERP BI solutions, can keep an institution from jumping into the BI pool. This presentation will explain how a Proof of Capability jumpstarted BI at Miami University for an initial investment of less than $50,000, including: the original push for BI, where we got help, why enrollment was chosen, the project timeline, publicizing the Proof of Capability, and next steps.

Frangipani
End-to-End Business Intelligence Solution using Microsoft and IStrategy, Kim Berlin, Stony Brook University

Learn how Stony Brook University decided to take advantage of Microsoft Business Intelligence technologies as an end-to-end solution for database modeling, ETL, report design, end-user interactive tools, and delivery. The starting point of our data warehouse was acquiring the HigherEd data models, ETL Integration Services projects, Analysis Services databases, and ProClarity reports from IStrategy Solutions. We customized IStrategy's delivered product and extended the reporting and delivery capabilities through SQL Server Reporting Services, Report Builder, PerformancePoint dashboards, Excel Services, and Microsoft Office SharePoint Server 2007. This session will illustrate the capabilities of Microsoft's BI products, underlining the ease of design, development, and flexibility of this productive and functional solution.

Georgian
A Data Warehouse Implementation Experience with OBIEE for Institutional Research, Banu Solak, University of Massachusetts-Amherst

University of Massachusetts Amherst has been implementing the PeopleSoft Campus Solutions Warehouse which is a part of Oracle’s PeopleSoft Enterprise Performance Management (EPM) Suite. In this presentation, we will briefly explain the architecture and problems encountered during our implementation from an Office of Institutional Research perspective. We will include examples on report creation/ conversion on Answers and use of Dashboards, in addition to discussing custom design needs of OIR and ways to integrate custom tables and census files into this solution. We will conclude with the lessons learned from this experience.