
Report
So how did the revised student HESA return actually impact on an
institution?
We will compare the experiences of two institutions, one using an
'off the shelf' and the other an 'in-house' student management
system. The sessions will review the availability of the resources,
technical expertise and business knowledge needed within an
institution, identify practical implementation issues, and describe
on-going work.
The workshop will review and detail:
 How the two institutions differ.
 What worked?
 What didn't work?
 The sharing of experiences – group work.
 Feedback and summary.
Size and Shape information (Huddersfield)
 20,431 active students in 2008/09.
 122 countries represented at Huddersfield.
 2,338 internationally domiciled students.
 Circa 20,000 UCAS applications per year.
 Academic Staff/UG Student ratio, 1:19.
 1,920 people employed at 31st July 2008.
 3 university campuses.
 Lead institution for West Yorkshire LLN.
 Lead institution for PCET Consortium.
Size and Shape information (Sheffield)
 24,004 active students in 2008/09.
 131 countries represented at Sheffield.
 4,636 internationally domiciled students.
 Circa 35,000 UCAS applications per year.
 Academic Staff/UG Student ratio, 1:14.
 5,749 people employed at 31st July 2008.
Corporate Information System (Huddersfield):
Student 'SITS-Vision' system with Agresso Financial and
Professional Personnel HR, with data linked to the data warehouse:
 Programme and module management.
 Admissions and recruitment.
 Online student registrations and devolved (web) student
personal data maintenance.
 Other web-enabled functions, e.g. results.
 Student Finance and Fees.
 Course/Module Assessment.
 Placements.
 Progress Records and Thesis Tracking.
 Ceremonies/Awards and Transcripts.
 Alumni.
 Management applications and external returns, e.g. HESES,
HESA, TDA etc.
 Agresso Financial and Professional Personnel HR.
 Corporate Data Model (data warehouse).
Corporate Information System (Sheffield):
Student 'Oracle Education System', with SAP Financials and HR,
with data linked to the data warehouse:
 Programme and module management.
 Admissions and recruitment.
 Online student registrations and devolved (web) student
personal data maintenance.
 Student Finance and Fees.
 Timetabling.
 Departmental Assessment System.
 Progress Records and Thesis Tracking.
 Ceremonies/Awards and Transcripts.
 Facilities Management.
 Management applications and external returns, e.g. HESES,
HESA, TTA etc.
 SAP – Financials and HR with eRecruitment.
 Corporate Data Model (data warehouse).
HESA Related Resources at Huddersfield:
 Business Requirement
o Business analyst
o Business liaison
 HESA Specification
o Project management
o Data quality
 Technical
o XML Support

HESA Related Resources at Sheffield:


 Business Requirement.
o Business analysts.
o Business liaison and systems development.
 HESA Specification.
o Project Management staff.
o Data Quality and MI Team.
 Technical.
o CIS technical and data infrastructure.
o Oracle/XML/SQL programmers and developers.
Implementation (Huddersfield):
 Project Staff
o Project Manager/Business Analyst
o Data Quality
o Ad hoc requirements
Implementation (Sheffield):
 Project Staff.
o Project Manager – liaison.
o Business Analyst.
o Data quality work.
o Oracle programmer with XML expertise.
o Various CIS developers as required.
Implementation (Huddersfield):
 ASIS Development Group
 ARO and School/Service staff
 Regular progress monitoring to Deputy Vice-Chancellor and
Senior Executive Officer
Implementation (Sheffield):
 Project Committee (PRINCE2 Project Management).
o Policy decisions.
o Resource allocation.
o Guidance and support.
 Operational Sub-Group:
o Acquisition of new data (admissions, student services,
international office, research office).
o Changes to business processes.
o Data Quality Review.
 Technical Sub-Group:
o Reference Data.
o SQL script design (a sketch of this kind of check follows).
o CIS process changes.
o Oracle and XML outputs.
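
As an illustration of the sort of check the Technical Sub-Group's SQL
scripts might perform, here is a minimal sketch in Python with SQLite.
The table and column names (student, ref_domicile, domicile) are
hypothetical placeholders; the actual CIS runs on Oracle with its own
schema, and this is not the institution's real script.

import sqlite3

conn = sqlite3.connect("cis_extract.db")

# Find student rows whose domicile code is absent from the valid
# reference data - the kind of mismatch a reference-data review
# would need to surface before the return is generated.
rows = conn.execute(
    """
    SELECT s.student_id, s.domicile
    FROM student AS s
    LEFT JOIN ref_domicile AS r ON r.code = s.domicile
    WHERE r.code IS NULL
    """
).fetchall()

for student_id, code in rows:
    print(f"{student_id}: invalid domicile code {code!r}")

conn.close()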
What worked? (Huddersfield)
 Internal Liaison
o Strengthened existing co-operation between different areas.
o Opportunity to remind operational staff of the wider impact
of their work.
o Strengthened work already carried out on consistency of
operations.
o Gave a 'business case' for certain operations.
o Additional data quality checks, leading to further
improvement in data quality.
o Support from XML expert.
o Majority of data in a single system.
What worked? (Sheffield)
 Internal Liaison:
o Increased co-operation in addressing external data
requirements from across the institution.
o Greater understanding by operational staff of the wider
impact of their work.
o Refocus on how the CIS student record was operated, so that
there was renewed consistency in its use.
o Further agreement on data quality responsibilities across
operational offices (admissions and student registrations
offices).
o Improvements in overall data quality for both internal and
external users.
o Support from HESA in creating a manual OS Aggregate Return
in Excel, including XML conversion (a sketch of such a
conversion follows).
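
The slides do not say how the Excel-to-XML conversion was actually
done; as a hedged illustration only, a small Python script using
openpyxl could serialise a simple worksheet into XML. The workbook
layout, file names and element names here are invented placeholders,
not the real HESA aggregate schema.

import xml.etree.ElementTree as ET
from openpyxl import load_workbook

wb = load_workbook("os_aggregate_return.xlsx")
ws = wb.active

root = ET.Element("AggregateReturn")
# Assume row 1 holds column headings (valid XML element names)
# and each subsequent row is one record.
headers = [cell.value for cell in ws[1]]
for row in ws.iter_rows(min_row=2, values_only=True):
    record = ET.SubElement(root, "Record")
    for name, value in zip(headers, row):
        if value is not None:
            ET.SubElement(record, str(name)).text = str(value)

ET.ElementTree(root).write("os_aggregate_return.xml",
                           encoding="utf-8", xml_declaration=True)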
What worked? (Huddersfield)
 Third party software
o Programming work done for us, thereby allowing us to
concentrate on data quality and business process
requirements.
o SITS Forum enabled help from other HEIs.
 Process documentation
o Forced us to sit down and improve what internal
documentation we already had.
 HESA Liaison
o Accommodating with requests for extensions.
o Reassurance that others were in the 'same boat'.
What worked? (Sheffield)
 CIS Developments:
o Allocation of development resources.
o The management of new data fields and reference data
changes, which impacted on the 'live' operational systems.
o Development of specialist algorithms, e.g. proportional
load calculations (see the sketch after this list).
o Overwriting student, programme and unit system data.
o Creation of a schema database populated with data errors,
allowing for easy analysis and identification of records
requiring correction.
o Schema and XML – no (or very few) problems, as created
locally (local expertise at hand).
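
The slides do not give the actual proportional load algorithm, so the
following is a simplified sketch of the general idea only: apportioning
a student's load for the reporting year across modules by credit
weighting. Function name, figures and rounding are all assumptions.

def proportional_loads(modules, year_fte=100.0):
    """modules: dict of module code -> credit value.
    Returns each module's share of the year's load (percent)."""
    total_credits = sum(modules.values())
    if total_credits == 0:
        return {code: 0.0 for code in modules}
    return {
        code: round(year_fte * credits / total_credits, 1)
        for code, credits in modules.items()
    }

# Example: a full-time student taking 120 credits across four modules.
print(proportional_loads({"MOD101": 40, "MOD102": 40,
                          "MOD103": 20, "MOD104": 20}))
# -> {'MOD101': 33.3, 'MOD102': 33.3, 'MOD103': 16.7, 'MOD104': 16.7}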

 HESA Liaison:
o Realistic in the way they liaised over late returns.
What didn't work? (Huddersfield)
 Project Management
o Fixed deadlines.
o Shifting specification and late changes to business rules
and validation kits.
o Slowness of HESA guidance on interpretation of the
specification – not always their fault.
o Very devolved institution on the academic side – how do we
ensure all the staff involved understand the importance and
implications of what they are doing?
What didn't work? (Sheffield)
 Project Management:
o No flexibility to re-schedule/extend timescales; all fixed
to a national deadline.
o Unable to maintain development schedules due to a shifting
specification.
o Support resource from HESA – slow responses; continued
reference back to their statutory customers for
clarification on requirements. Unfair on HESA Liaison and
on us.
o Excessive call on the local HESA expert. Revisions in the
specification tied up this vital resource and interfered
with the scheduled analysis of the full specification from
HESA for supply to programmers.
o Inappropriate lead times from HESA's statutory customers.
o Insufficient time to check that the local specification had
been created correctly.
What didn't work? (Huddersfield)
 3rd party software
o Timing of release of 'hot fixes' with necessary updates for
HESA processing.
o Resources required to provide the software were too reliant
on certain individuals.
o Changing specification added to delivery problems.
o Lack of wildcard functionality meant we couldn't cross-check
against the HESES re-creation in the same way as in previous
years.
 Data and Quality Issues
o UCAS data for HESA (*J) – too late and of poor quality.
o Validation/data quality issues raised by HESA
post-submission (a local pre-submission check is sketched
below).
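
One way to catch such issues before submission rather than after is to
validate the generated XML locally against the published schema. A
minimal sketch using Python's lxml library follows; the file names are
placeholders, and this assumes the relevant XSD has been downloaded
locally.

from lxml import etree

# Load the published schema and the locally generated return.
schema = etree.XMLSchema(etree.parse("hesa_student_return.xsd"))
doc = etree.parse("student_return.xml")

if schema.validate(doc):
    print("Return validates against the schema.")
else:
    # Each log entry gives the line number and message for a
    # failing element, which can then be traced back to the record.
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")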
What didn't work? (Sheffield)
 CIS amendments:
o Reference data is at the heart of CIS.
o Built co-operation from operational areas, until CIS had to
be amended – just as recruitment and online registration of
students were being finalised – but we had no choice.
o Operational and MI reports required rewrites (approx. 300
operational reports checked).
 Data and Quality Issues:
o Need for unit records against all students.
o The relational structure of the HESA record does not
reflect operational reality. OWNSTU as a unique student
identifier? (A sketch of a uniqueness check follows.)
o UCAS data for HESA (*J).
o UofA for student supervisors.
What didn't work? (both institutions)
 HESA and their Statutory Customers:
o Too much change in a single year.
▪ New reference data and amendments to existing data.
▪ Changes to existing data fields.
▪ New data requirements.
▪ Different time scales of implementation between HESA
and UCAS.
o The specification failed to deliver a stable requirement;
too many versions were published (even past key delivery
dates).
o Conflicting guidance on some key data fields between HESA
and the funding council – each cross-referencing the other.
o Business rules and validation questions did not always seem
logical.
Please discuss, and summarise your discussion points on the supplied
paper. Identify a member of the group who can feed back these points
at the end of the session.
