CA PPM Jobs Reference

Clarity Project and Portfolio Management (PPM) includes stock jobs. As an administrator, use the following list of jobs, descriptions, and parameters for running jobs successfully.
Autoschedule Investment Job
This job creates or overwrites tentative schedules by automatically scheduling tasks based on task dependencies, constraints, priority order, and related date and resource logic. This job can be run concurrently with other instances of this job.
Parameters
  • Investment
    Defines the name of the investment to schedule.
  • OBS Unit
    Defines the specific OBS unit to schedule. All projects that are contained in the OBS unit and for which you have the Project - View Management right are auto scheduled. If both a project and an OBS unit are selected, the project is ignored and the OBS unit is used.
  • Autoschedule Date
    Defines the date from which to schedule tasks.
  • Ignore Tasks Starting Before
    Defines the date before which tasks are ignored during auto scheduling.
  • Ignore Tasks Starting After
    Defines the date after which tasks are excluded from auto scheduling.
  • Resource Constraints
    Indicates if this job considers resource availability during auto scheduling.
    Default:
    Selected
  • Schedule from Finish Date
    Indicates whether the job schedules backward from the finish date to the start date, rather than forward from the start date to the finish date.
    Default:
    Cleared
  • Subnets
    Indicates whether the job calculates the critical path for the entire project during auto scheduling.
    Default:
    Cleared
  • Honor Constraints on Started Tasks
    Indicates whether the job schedules the remaining work on started tasks according to its normal auto schedule logic, including any task constraints and dependencies.
    Default:
    Cleared
  • Schedule Assignments on Excluded Tasks
    Indicates whether the job can change the task assignment dates, as long as the new dates stay within the existing start and finish dates for the task.
    Default:
    Cleared
  • Start Successors on Next Day
    Indicates whether the job starts all successor tasks on the next day.
    Default:
    Cleared
  • Publish After Scheduling
    Indicates if changes made to the tentative plan are automatically published to the Plan of Record at the end of auto scheduling. When selected, the tentative plan is deleted and the project is unlocked.
    Default:
    Cleared
Clean User Session Job
This job removes the session-based user data stored in CA PPM for the logged-in resource. Data is removed when its creation date/time, plus the length of the session expiration time, falls before the date/time this job runs.
User data contains references to the resource logged in to CA PPM and any session-based data that has persisted, such as Shopping Carts and Search Results.
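To illustrate the removal criterion as a simple date comparison, the following minimal Python sketch is provided. It is illustrative only and is not the CA PPM implementation; the record fields and the minutes-based expiration length are assumptions.

    from datetime import datetime, timedelta

    def session_data_expired(created_at, session_expiration_minutes, job_run_time):
        # Data is removed when its creation date/time plus the session
        # expiration length falls before the date/time the job runs.
        return created_at + timedelta(minutes=session_expiration_minutes) < job_run_time

    # Data created at 09:00 with a 60-minute session expiration is removed
    # by a job run at 11:00.
    print(session_data_expired(datetime(2024, 1, 1, 9, 0), 60,
                               datetime(2024, 1, 1, 11, 0)))  # True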
Requirements
None
Restrictions
This job cannot run concurrently with other jobs including other instances of this job.
Parameters
None
Copy Cost Plan of Record Charge Code with Cost Type Job
This job creates a copy of a plan of record and adds Cost Type as an extra grouping attribute to the existing grouping attributes. This job is intended to process only investments that were created before Release 13.2.
This job completes the following tasks:
  • The job processes only plans of record that have Charge Code as one of the grouping attributes. The job copies the current plan of record and adds Cost Type to an existing set of grouping attributes. The newly added Cost Type grouping attribute is set to either Capital or Operating on every line. How the attribute is set depends on the charge codes you designate as Capital or Operating in the job parameters. If the charge code for a line is selected in the Capital Charge Code mapping, the cost type of the line is set to Capital. Otherwise, in all cases (including a null charge code), the cost type for the line is set to Operating.
  • The job creates a cost plan with the following name and ID:
    • The new cost plan is named System Converted Cost Plan.
    • The new cost plan ID is created using the format COST_PLAN_CONV_<Year-Month-Day (####-##-##)>. For example: COST_PLAN_CONV_2013-08-15.
  • If the check box named Set New Cost Plan as Plan of Record is selected, the new cost plan becomes the plan of record.
  • When the Copy Latest Approved Budget Plan check box is selected, the following rule applies: The current approved budget plan is copied only if it has a charge code selected as a grouping attribute. The newly created budget plan is marked as the approved and current budget plan. If there is no approved budget (only a submitted budget plan), the job does not copy the submitted budget. If there are many approved budget plans and a submitted budget, the job copies the current approved budget plan and marks the newly created plan as the current approved budget plan.
    The rules for setting the value of the newly added Cost Type grouping attribute are the same as described previously for the newly added cost plan.
    The job creates a budget plan with the following name and ID:
    • The new budget plan is named System Converted Budget Plan.
    • The budget plan ID is created using the format BUDGET_PLAN_CONV_<Year-Month-Day>.
    The newly copied budget is set as the current budget. The newly copied cost plan is made the plan of record only if the Set New Cost Plan as Plan of Record check box is selected.
  • If no failure occurs during the processing of an investment plan, the job marks the investment as successfully processed. Successfully processed investments are skipped in subsequent executions of the job even if these investments are selected again through an OBS Unit. Successfully processed investments stop appearing in the Investment browse on the job input screen. Therefore, it is important to select the correct options before submitting investments to the job.
This job is intended only to update financial plans for investments so that capital cost is displayed. The job is not intended as an ongoing function for bulk updates. We recommend that you deactivate the job after you have enabled investments for capitalization and have copied their plans of record.
The statistics of a job execution are printed in a BG log file for the job. You can open and read the log file. The log file contains information such as the number of processed, skipped, or failed investments.
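For illustration, the following Python sketch shows the generated plan IDs and the Cost Type rule described above. The helper names and the per-line structure are hypothetical; only the ID formats and the Capital/Operating rule come from this description.

    from datetime import date

    def converted_plan_ids(run_date):
        # IDs for the copied plans: COST_PLAN_CONV_<YYYY-MM-DD> and
        # BUDGET_PLAN_CONV_<YYYY-MM-DD>.
        stamp = run_date.strftime("%Y-%m-%d")
        return f"COST_PLAN_CONV_{stamp}", f"BUDGET_PLAN_CONV_{stamp}"

    def cost_type_for_plan_line(charge_code, capital_charge_codes):
        # A line whose charge code is in the Capital mapping becomes Capital;
        # every other line, including a null charge code, becomes Operating.
        return "Capital" if charge_code in capital_charge_codes else "Operating"

    cost_plan_id, budget_plan_id = converted_plan_ids(date(2013, 8, 15))
    print(cost_plan_id)                                              # COST_PLAN_CONV_2013-08-15
    print(cost_type_for_plan_line("eCom", {"eCom", "Engineering"}))  # Capital
    print(cost_type_for_plan_line(None, {"eCom", "Engineering"}))    # Operating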
Requirements
A job that is named Enable Capitalization must be run on an investment before you can select the investment for this job.
Restrictions
If your investment cost plans contain a large amount of detail, process a few investments at a time.
Parameters
  • Investment OBS
    Specifies the OBS whose investments you want to process. The job processes all investments for the OBS and its descendants. Use this option if you do not want to select individual investments for processing.
  • Investment
    Specifies the individual investments that you want to process.
  • Capital Charge Code
    Specifies the charge codes that you want to designate as the cost type Capital. If you do not indicate a value for a specific charge code, the default cost type Operating is applied.
  • Operating Charge Code
    Specifies the charge codes that you want to designate as operating cost. If you do not indicate a value for a specific charge code, the default cost type Operating is applied.
Create Business Objects Users Job
This job creates missing, active CA PPM non-LDAP users on Business Objects and adds these users to the CA-PPM-Reporting-User User Group in Business Objects.
Users are created with a randomly generated password. The Business Objects Administrator needs to set the password for each user before they can log in to BusinessObjects InfoView.
You can run this job immediately or on a scheduled basis.
Requirements
Configure and execute Business Objects.
Parameters
None
Create and Update Jaspersoft Users
The job creates advanced reporting users if they do not already exist in Jaspersoft. The job also passes and updates user properties from CA PPM to Jaspersoft. It updates license types in CA PPM and assigns Jaspersoft Roles based on assigned Advanced Reporting access rights.
The job also completes the following tasks.
  • Disables users in Jaspersoft if they are inactivated or locked in CA PPM. In case a valid CA PPM user is disabled or deleted by the Jaspersoft superuser, the next run of the job makes the user active again in Jaspersoft.
  • Creates a folder for each new Jaspersoft user with their user names under Users in the CA PPM organization. The folder is private to the specific user and can only be accessed by either that user or the Jaspersoft administrator. On upgrade, the contents of the folder will remain intact.
Run this job after creating or updating the advanced reporting users in CA PPM to synchronize the users with Jaspersoft.
Datamart Extraction Job
This job extracts data from the transactional database tables and stores them in easily understood reporting tables. These tables are the foundation for most reports that are delivered with CA PPM and are used for any custom reports.
For Microsoft SQL Server 2005 with the SQL Server Agent enabled, you add the CA PPM administrator account to the SQLAgentUserRole role to run datamart extraction jobs.
See the Microsoft SQL Enterprise Manager documentation for the details about adding user accounts to the SQLAgentUserRole role.
Requirements
(Recommended) Run this job after the Time Slicing job.
Define daily time slice definitions with a start date at least three months before the current date. Remove any datamart extraction options that are set as a datamart setting required for the job; that setting is for the Datamart Rollup - Time Facts and Time Summary job.
(Optional) Set up Datamart stoplights.
Configure the following Datamart settings:
  • Datamart Currency
  • Datamart Entity
  • Datamart Extraction Options
  • Project OBS Mapping
  • Resource OBS Mapping
Restrictions
This job cannot run concurrently with the following jobs:
  • Datamart Extraction
    Other instances of this job cannot exist.
  • Datamart Rollup - Time Facts and Time Summary
  • Delete Investments
  • Import Financial Actuals
  • Post Timesheets
  • Post Transactions to Financial
  • Recalculate Cost of Capital Fields
  • Time Slicing
Parameters
None
Datamart Rollup - Time Facts and Time Summary Job
This optional job populates the following time facts and time summary tables for resources who want to develop custom reports:
  • NBI_PM_PT_FACTS
  • NBI_FM_PT_FACTS
  • NBI_RT_FACTS
  • NBI_PM_PROJECT_TIME_SUMMARY
  • NBI_FM_PROJECT_TIME_SUMMARY
  • NBI_RESOURCE_TIME_SUMMARY
The populated data is not used in standard CA PPM reports.
Requirements
(Recommended) Run this job after the Datamart Extraction job.
Restrictions
This job cannot run concurrently with the following jobs:
  • Datamart Rollup - Time Facts and Time Summary
    Other instances of this job cannot exist.
  • Datamart Extraction
  • Delete Investments
Parameters
None
Delete Investments Job
This job permanently deletes investments (projects, programs, assets, and other work) and their associated data, including investment hierarchy, financial data, tasks, timesheets, documents, and time periods, when the investments are marked for deletion.
Change the project track mode from Clarity to Other if the assignments have actuals. If you do not change the track mode, the project is not deleted.
Requirements
  • The <Investment> - Delete access right.
  • To delete projects, they must be inactive, marked for deletion, and cannot contain time entries.
  • (Recommended) Back up all projects before running this job.
Restrictions
This job cannot run concurrently with the following jobs:
  • Delete Investments (other instances of this job cannot exist)
  • Import Financial Actuals
  • Post Timesheets
  • Post Transactions to Financial
  • Time Slicing
Parameters
None
Delete Log Analysis Data Job
This job removes the CA PPM log analysis-related data. The criteria for removing the data is the LOG_DATE field on each of the log analysis tables.
This job is scheduled automatically to run at 1:00 AM each day.
Requirements
None
Restrictions
None
Parameters
  • Log retention in days
    Specifies the number of days that data is retained in the tables that are related to analyzed access logs. The default value for this parameter is 30 days.
  • Session token retention in days
    Specifies the number of days that data is retained in the table LOG_SESSIONS. The data specifically stores a mapping of the CA PPM session token to CMN_SEC_USERS.ID for analysis and audit purposes. The default value for this parameter is 14 days.
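The retention parameters above translate into simple cutoff dates. The following Python sketch is a minimal illustration of that calculation; the dictionary keys and the delete statement mentioned in the comment are hypothetical, while LOG_SESSIONS and LOG_DATE are the table and column named above.

    from datetime import datetime, timedelta

    def retention_cutoffs(run_time, log_retention_days=30, session_token_retention_days=14):
        # Rows whose LOG_DATE is earlier than the returned cutoff are removed,
        # for example: DELETE FROM LOG_SESSIONS WHERE LOG_DATE < :session_cutoff
        # (illustrative only).
        return {
            "log_analysis_cutoff": run_time - timedelta(days=log_retention_days),
            "session_token_cutoff": run_time - timedelta(days=session_token_retention_days),
        }

    print(retention_cutoffs(datetime(2024, 6, 1, 1, 0)))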
Delete Process Instance Job
This job deletes a process instance with a status of Done or Aborted.
Requirements
None
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
  • Process Name
    Defines the name of the process instance to delete.
  • Process Instance Status
    Defines the status of the process instance that you want to delete. Select a status from the drop-down.
    Values:
    • Aborted
    • Done
  • Finish Date From
    Defines the date from which all completed process instances within the selected date range are deleted. Specify one of the following dates:
    • Specific Date -- Enter a date or use the Calendar tool to select the date.
    • Relative Date -- Select the appropriate relative date from the drop-down.
  • Finish Date To
    Defines the date to which all process instances within the selected date range are deleted. Specify one of the following dates:
    • Specific Date -- Enter a date or use the Calendar tool to select the date.
    • Relative Date -- Select the appropriate relative date from the drop-down.
  • Object Type
    Defines the object type of process instance you want to delete, such as a project name. Enter the name of the object to process, or use the search tool to select an object.
  • Initiated By
    Defines the name of the user who initiated the process instance you want to delete. Enter the resource name or use the search tool to select a resource.
    Enter the OBS name or use the search tool to select the OBS name or OBS unit.
Enable Capitalization Job
This job enables you to set expenses in investments that were created before Release 13.2 as either Capital or Operating. Before Release 13.2, all expenses were operating expenses by default. The job sets the cost type as either Capital or Operating on the following items:
  • Investments
  • Investment tasks
  • Investment transactions
After you run this job, run the Copy Cost Plan of Record Charge Code with Cost Type job to complete the capitalization feature setup.
The job is not intended as an ongoing function for bulk updates. We recommend that you deactivate the job after you complete the capitalization feature setup for your investments.
What the Job Does
  • Sets the investment cost type to either Capital or Operating.
    If the charge code of the specified investment is selected in the Capital Charge Codes browse field on the job screen, the cost type of that investment is set to Capital. Otherwise, if the charge code of the specified investment is selected in the Operating Charge Code browse field or if the charge code is not chosen in either of the browses, the cost type of that investment is set to Operating.
  • Sets the task cost type to Capital or Operating for each task according to charge codes.
    If the charge code of the specified task is selected in the Capital Charge Codes browse field on the job screen, the cost type of that task is set to Capital. Otherwise, if the charge code of the specified task is selected in the Operating Charge Code browse field or if the charge code is not chosen in either of the browses, the cost type of that task is set to Operating.
  • Sets the transaction cost type to Capital where appropriate.
    If the charge code of a transaction for a selected investment maps to a capital charge code, the cost type of that transaction is set to Capital. The cost type is not changed for all other cases.
    During an upgrade, all transactions are set to the Operating cost type by default.
  • Sets the team Capitalization % value to the value you enter in the Capitalization Percent parameter.
    This value is used to calculate the percentage of operating cost and capital cost for employee allocations. A small sketch of this decision logic follows this list.
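The following Python sketch summarizes the decision logic in the list above under stated assumptions: the null-charge-code behavior for tasks follows the example table below, and the capitalization split is one plausible reading of how the Capitalization Percent is applied.

    def investment_or_task_cost_type(charge_code, capital_codes):
        # Capital when the charge code is in the Capital mapping; Operating for
        # any other non-null charge code; left unset (None) when the charge code
        # itself is null, as for the Miscellaneous task in the example below.
        if charge_code is None:
            return None
        return "Capital" if charge_code in capital_codes else "Operating"

    def transaction_cost_type(current_cost_type, charge_code, capital_codes):
        # Transactions switch to Capital only when their charge code maps to a
        # capital charge code; otherwise the existing value is left unchanged.
        return "Capital" if charge_code in capital_codes else current_cost_type

    def split_allocation_cost(total_cost, capitalization_percent):
        # Assumed split of an employee allocation cost into capital and
        # operating portions using the Capitalization Percent parameter.
        capital = total_cost * capitalization_percent / 100.0
        return capital, total_cost - capital

    capital_codes = {"eCom", "Engineering", "CRM"}
    print(investment_or_task_cost_type("Engineering", capital_codes))           # Capital
    print(transaction_cost_type("Operating", "Travel Expense", capital_codes))  # unchanged
    print(split_allocation_cost(1000.0, 40))                                    # (400.0, 600.0)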
Example
The following table shows how the job processes two investments and the cost types that are assigned to tasks and transactions. In the job parameters, the following selections are specified.
Investments: eCommerce Portal Implementation, CRM Enhancement
Capital Charge Codes: eCom, Engineering, CRM
Operating Charge Codes: Operations, Maintenance
Investment | Item Type | Item | Charge Code | Cost Type Assigned | How the Cost Type Was Assigned
eCommerce Portal Implementation | | | eCom | Capital |
| Tasks | Development | Engineering | Capital | The Development task has a charge code of Engineering, which is designated as a Capital charge code; therefore, the task is assigned a Capital cost type.
| | Training | Travel Expense | Operating | The Travel Expense charge code was not selected in either the Capital or Operating Charge Code parameter field; therefore, the Training task receives the default cost type of Operating.
| | Miscellaneous | Null | Null | The cost type for the Miscellaneous task is NULL because the charge code is NULL on the task.
| Transactions | Software Purchase | Engineering | Capital | This transaction has a charge code of Engineering, which is a Capital charge code.
| | Hardware Purchase | Null | Operating | The charge code was not selected in any charge code fields; therefore, the job does not change the value of the cost type. However, the default is set to Operating for all transactions during an upgrade.
| | Attended Conference | Travel Expense | Operating | The charge code was not selected in any charge code fields; therefore, the job does not change the value of the cost type. However, the default is set to Operating for all transactions during an upgrade.
CRM Enhancement | | | CRM | Capital |
| Tasks | Resolve bugs | Maintenance | Operating | The Resolve Bugs task has a charge code of Maintenance, which is designated as an Operating charge code; therefore, the task is assigned an Operating cost type.
| | Help Desk | Operations | Operating | The Help Desk task has a charge code of Operations, which is designated as an Operating charge code; therefore, the task is assigned an Operating cost type.
The statistics of a job execution are printed in a BG log file for the job. You can open and read the log file. The log file contains information such as the number of investments that were processed, skipped, or failed.
Requirements
None.
Restrictions
The length of time the job runs depends on the amount of data that is associated with the investments you select. If you have investments with a great amount of associated data, we recommend limiting the number of investments for a job run. We recommend running the job immediately after the upgrade before any modifications to investments, tasks, or transactions are made.
Parameters
  • OBS Unit
    Specifies the OBS units whose investments you want to process. The job processes all investments for the OBS units and their descendants. Use this option if you do not want to select individual investments for processing.
  • Investment
    Specifies the individual investments that you want to enable to display capital and operating expenses.
  • Capital Charge Code
    Specifies the charge codes that you want to designate as the cost type Capital for the selected investments (based on the OBS unit or individual investments).
  • Operating Charge Code
    Specifies the charge codes that you want to designate as operating cost for the selected investments (based on the OBS unit or individual investments).
  • Capitalization Percent
    Specifies the amount of expense for an investment that is designated capital expense. This number is used to calculate operating and capital expense.
Execute a Process Job
This job executes a process that is not associated with any object.
Requirements
None
Restrictions
  • It can only execute the processes to which you have access.
  • It can only execute non-object based processes (that is, processes without a primary object).
Parameters
  • Process ID
    The ID of the process to execute.
Generate Invoices
This job takes a set of unprocessed transactions, matches them with best-fitting chargeback rules from the investment hierarchies, and applies the rule to generate chargeback transactions. An invoice header is generated for every unique combination of department and fiscal time period.
If an invoice exists for the department-fiscal time period combination, and if the invoice is locked, the job cannot generate new chargeback transactions against the invoice. Instead new chargeback transactions are created for an invoice in the next, unlocked fiscal time period.
Your finance manager can view or monitor any errors or warnings that are caused by running this job.
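One way to picture the grouping and the locked-invoice rule is the following Python sketch. The data structures and the next_unlocked_period helper are hypothetical; only the grouping key (department plus fiscal time period) and the carry-forward behavior for locked invoices come from this description.

    from collections import defaultdict

    def group_chargebacks(transactions, locked_invoices, next_unlocked_period):
        # One invoice header per unique (department, fiscal period) combination.
        # When that invoice is locked, the transaction is carried to the next
        # unlocked fiscal period instead.
        invoices = defaultdict(list)
        for txn in transactions:
            key = (txn["department"], txn["fiscal_period"])
            while key in locked_invoices:
                key = (key[0], next_unlocked_period(key[1]))
            invoices[key].append(txn)
        return dict(invoices)

    txns = [{"department": "IT", "fiscal_period": "2017-01", "amount": 500.0}]
    locked = {("IT", "2017-01")}
    print(group_chargebacks(txns, locked, lambda period: "2017-02"))
    # The transaction lands on the invoice for the next unlocked period, 2017-02.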
Requirements
  • Set up a financial structure, including entity, financial classes, and rate matrices.
  • Define credit and overhead rules.
  • Define investment debit or standard debit rules to process chargebacks.
  • Post the WIP transactions.
See Financial Management for more information.
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
  • Entity
    Defines the entity for which to generate invoices.
  • Regenerate
    Indicates if this job processes all transactions or only new and updated transactions.
    Options:
    • All. Regenerates the chargeback transactions by applying existing rules.
    • New Transactions/Adjustments. Processes only new unprocessed transactions or adjustments.
  • Lock/Submit
    Indicates if all invoices from prior periods are automatically locked and submitted. If an invoice is already locked (for example, a user may be actively reviewing it), automatic submit may not occur.
    Options:
    Lock/Submit or None
  • Override Manual Locks
    Indicates if this job can temporarily unlock previously generated invoices and regenerate them.
Import Financial Actuals Job
This job updates task assignments with the actuals entered in financial transactions and WIP adjustments. The assignment ETC is decremented through the transaction entry date (as in a Fixed Loading Pattern). ETC in the future is not decremented even if the actual amount is greater than the ETC amount for the period being posted.
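A minimal Python sketch of the decrement rule follows. The period keying (by period end date) and the walk through periods are assumptions; the rule that ETC after the transaction entry date is never reduced comes from this description.

    from datetime import date

    def apply_actuals_to_etc(etc_by_period_end, actual_hours, entry_date):
        # Reduce ETC only in periods that end on or before the transaction entry
        # date; any leftover actuals never reduce ETC scheduled after that date.
        remaining = actual_hours
        for period_end in sorted(etc_by_period_end):
            if period_end > entry_date or remaining <= 0:
                break
            consumed = min(etc_by_period_end[period_end], remaining)
            etc_by_period_end[period_end] -= consumed
            remaining -= consumed
        return etc_by_period_end

    etc = {date(2024, 1, 31): 10.0, date(2024, 2, 29): 10.0}
    print(apply_actuals_to_etc(etc, 25.0, date(2024, 1, 31)))
    # January's 10 hours of ETC are consumed; the February ETC stays untouched.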
Requirements
None
Restrictions
This job cannot run concurrently with the following jobs:
  • Import Financial Actuals (other instances of this job cannot exist)
  • Delete Investments
Parameters
None
Index Contents and Documents for Searches Job
This job indexes the search content (such as activities and action items) and documents.
Requirements
None
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
None
Investment Allocation Job
This job updates the ETCSUM and EACSUM fields for investments and bases its calculations on the resources that are allocated to the investment. The job calculates the sum of the ETC values for all the assignments for the investment and stores the value in the ETCSUM field. The job also calculates the sum of the EAC values for all the investment assignments and stores the value in the EACSUM field. In addition, this job updates the total actuals for an investment.
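As a simple illustration of these rollups, the following Python sketch sums assignment values up to the investment. Treating EAC as actuals plus ETC is an assumption for the example; only the fact that ETCSUM and EACSUM are sums over the investment assignments comes from this description.

    def investment_totals(assignments):
        # ETCSUM: sum of assignment ETC; EACSUM: sum of assignment EAC
        # (here assumed to be actuals + ETC); total actuals summed alongside.
        etcsum = sum(a["etc"] for a in assignments)
        actuals = sum(a["actuals"] for a in assignments)
        eacsum = sum(a["actuals"] + a["etc"] for a in assignments)
        return {"ETCSUM": etcsum, "EACSUM": eacsum, "ACTUALS": actuals}

    print(investment_totals([{"etc": 40.0, "actuals": 10.0},
                             {"etc": 16.0, "actuals": 24.0}]))
    # {'ETCSUM': 56.0, 'EACSUM': 90.0, 'ACTUALS': 34.0}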
You can run this job immediately or on a scheduled basis. Do not set a frequent schedule for the Investment Allocation job. The lists of investments can display the updated investment actuals and ETC in intervals that approach a near real-time appearance of precision. However, for better performance, we recommend that you run the job for reporting on a daily schedule, and not more frequently. This job is database intensive because it aggregates data from multiple tables and updates every investment in the system.
All users can navigate to one of the lists that are based on investment type as a common entry point into the properties of all investments. To optimize the performance of the application, limit the number of fields that you add to these investment list pages.
To view actuals and ETC at the project level, use a custom portlet. To achieve better real-time results, limit the data that you include in the portlet layout.
When you run the Investment Allocation job in between incremental runs of the Rate Matrix Extraction job, the incremental option behaves in the same manner as a full execution. In this case, the Investment Allocation job updates all projects.
Requirements
None
Restrictions
None
Parameters
None
Load Data Warehouse
The Load Data Warehouse job extracts the data from the CA PPM database, transforms the data to a denormalized format, and loads the data into the Data Warehouse. The frequency of this job determines the freshness of the data within the Data Warehouse. This job populates dimensions, facts, and lookups in the Data Warehouse for stock objects and attributes and any custom objects and attributes that have been explicitly enabled for inclusion in the Data Warehouse. This job also updates the Advanced Reporting domains, if you have set up Advanced Reporting with Jaspersoft. For more information, see Installing and Upgrading.
Out-of-the-box, only the most commonly used stock objects and attributes are enabled for inclusion in the Data Warehouse. Enable all custom objects and attributes before running this job to populate them in the Data Warehouse.
This job is initially disabled. Enable this job before you run it.
The Data Warehouse is not designed to be a real-time data reference. If you need to run reports on live data, we recommend that you create ad hoc views and reports based on CA PPM data sources. See Ad Hoc Views, Reports, and Custom Report Development.
To populate the Data Warehouse with data from the CA PPM database, run the following jobs in the order given:
  • Time Slicing
  • Load Data Warehouse
Requirements
Run this job with the Full Load option selected in the following cases:
  • You perform an upgrade or install a patch on CA PPM.
  • You add a new language in the Data Warehouse system options. Running an incremental load of this job does not update the Data Warehouse with the new language.
  • You delete an attribute or unselect the Include in the Data Warehouse option for the attribute in Studio. You cannot re-enable the attribute for inclusion in the Data Warehouse until this job has completed at least one run.
  • You change the entity for fiscal periods in the Data Warehouse options.
  • You change the timeslice dates to include a larger timeframe.
  • You change the First Day of Work Week.
  • You change any of the settings in the System Options, Data Warehouse Options.
Additionally, run this job each time the data type of an attribute that is included in the Data Warehouse is changed in Studio. First disable the attribute, then run this job. Next, re-enable the attribute with the correct data type, and rerun this job for the changes to take effect.
Running this job with the Full Load option selected adds the following financial plans for applicable investments to the Data Warehouse:
  • The Plan of Record
  • The current budget plan
  • All benefit plans
Restrictions
While the Load Data Warehouse job runs, concurrent user edits may temporarily be excluded from the Data Warehouse. The job copies records based on their last_updated_date attribute. The job identifies all the object instances modified before the job start date and time. This tip may sound obvious to experienced database administrators; however, users should be made aware that any record modified while the job runs is not going to be populated. A unique situation arises with investments because they appear in two tables: one for investments and one for projects. If users are making concurrent updates, records in DWH_INV_INVESTMENT may not match in DWH_INV_PROJECT. The next instance of the full or incremental Load Data Warehouse job refreshes the data and corrects this common temporary condition.
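The incremental behavior can be pictured as a watermark filter on last_updated_date. The following Python sketch is illustrative only; the record structure and the previous-run watermark are assumptions.

    from datetime import datetime

    def incremental_batch(records, last_run_time, job_start_time):
        # Pick up object instances modified after the previous run and before
        # the current job start; rows modified while the job runs are picked up
        # by the next incremental or full load.
        return [r for r in records
                if last_run_time < r["last_updated_date"] < job_start_time]

    rows = [{"id": 1, "last_updated_date": datetime(2024, 5, 1, 9, 30)},
            {"id": 2, "last_updated_date": datetime(2024, 5, 1, 11, 5)}]  # edited mid-run
    print(incremental_batch(rows, datetime(2024, 5, 1, 8, 0), datetime(2024, 5, 1, 11, 0)))
    # Only id 1 is loaded; id 2 waits for the next load.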
Parameters
  • Full Load
    Rebuilds the Data Warehouse from scratch if selected. If the option is not selected, this job only looks for incremental changes since the last time you ran it. We recommend that you run the job with the Full Load option selected during off-peak hours for better performance. At other times, run this job for incremental loads.
    If you are running this job for the first time to populate the Data Warehouse, select the Full Load parameter. Also, if you had a failed run of the Load Data Warehouse job and you corrected the problem, select Full Load the first time that you run this job after the failure. For subsequent runs, you can run an incremental load by leaving the Full Load parameter unselected.
Database Size and Job Frequency
The size of your database determines how frequently you run this job.
  • Full loads
    recreate the Data Warehouse from scratch. You can run a full load once a day during non-peak hours. For a global enterprise, you may decide to extend the frequency to once a week, or even longer.
  • Incremental loads
    can be run more frequently; the frequency is determined by the size of your database. The best way to determine the frequency is to run the incremental load and see how long it takes to complete. We recommend that you run an incremental load every 3-4 hours, depending on the size of your database.
Custom Jobs and Processes
This job imports data into the Data Warehouse database based on the last_updated_date field on the object's instances. If you have any custom jobs and processes that are updating the last_updated_date field, we strongly advise you to reschedule those jobs and processes to run at a different time when the Load Data Warehouse job is not running. In addition, set the Load Data Warehouse job to be incompatible with your custom jobs and processes (Administration, Reports and Jobs). If you do not do this, data may not be updated in the Data Warehouse.
Server Time, Timezone, Date, and Time
To help ensure the correct functionality and accuracy of data in CA PPM and all jobs, including the Load Data Warehouse job, verify the following:
  • The server time is the same (preferably, down to the second) on the CA PPM application server, CA PPM database server, and Data Warehouse database server.
  • The timezone, date, and time are the same on the CA PPM application server and database servers in the same environment. Do not have any differences.
This synchronization is necessary because the Load Data Warehouse job imports data into the Data Warehouse database based on the last_updated_date field on the object's instances. If the date and time on the servers do not match, data may not be loaded into the Data Warehouse. For other jobs, if the date and time do not match, the job may not start. Or, the job may start later than expected, leading to inaccurate data.
Load Data Warehouse Access Rights
The job extracts access rights for investments and resources from the Clarity Project and Portfolio Management (PPM) database and loads them into the CA PPM Data Warehouse.
To populate the Data Warehouse with data from the database, run this job after running the Load Data Warehouse job and after running the Time Slicing job. You do not need to run this job each time after you run the Load Data Warehouse job.
This job is initially disabled. Enable this job before you run it.
LDAP - Synchronize New and Changed Users Job
This job synchronizes users that were added or modified in the LDAP server with the CA PPM user table.
Requirements
  • Configure LDAP to run this job.
  • Be an authenticated LDAP user to view this report.
Restrictions
This job cannot run concurrently with any other instances of this job.
Parameters
None
LDAP - Synchronize Obsolete Users Job
This job deactivates users in the CA PPM user table who are marked as inactive or who are removed from the LDAP server.
To schedule this job, select LDAP as an Available Job filter.
Requirements
Configure LDAP to run this job.
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
None
Oracle Table Analyze Job
This job refreshes statistics that are used to determine the best execution path for a query. Analyze statistics under certain circumstances, such as when the schema or the data volume has changed.
Requirements
None
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
None
Post Incident Financials Job
This financial processing job posts the transactions of incident effort entries to the general ledger account. Run this job when you want to track the cost of maintaining non-project investments in your organization.
Requirements
Enter effort for the incident.
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
  • Effort From Date
    Defines the from-date to which effort has been posted for this job to process. Click the Select Date icon to select a specific date, or select a relative date from the drop-down.
  • Effort To Date
    Defines the to-date to which effort has been posted for this job to process. Click the Select Date icon to select a specific date, or select a relative date from the drop-down.
Post Timesheets Job
The Post Timesheets job is a background process that compiles and posts the actual values into the project plan. This job runs at scheduled intervals. It posts the approved actual hours to the resource assignments for timesheets whose time periods have finish dates in the past (at least five minutes ago).
This job does the following:
  • Updates the resource assignment and the Transaction Import table with the actual values on the timesheets. The timesheet status changes to "Approved".
  • Advances Estimate To Complete (ETC) past the time period for the posted timesheets on all task assignments of the corresponding resources.
For more information, see Getting Started.
Requirements
Set up Timesheets.
Restrictions
This job cannot run concurrently with the following jobs:
  • Post Timesheets (other instances of this job cannot exist)
  • Delete Investments
Parameters
None
Post Transactions to Financial Job
This job verifies and transfers data from the Transaction Import tables to the Financial Management tables. This data could be the result of posted timesheets, or transactions that are imported from external systems.
Requirements
Set up Financial Management and Timesheets.
Restrictions
This job cannot run concurrently with the following jobs:
  • Post Transactions to Financial (other instances of this job cannot exist)
  • Datamart Extraction
  • Delete Investments
Parameters
  • Transaction From Date
    Defines the from date to which transactions are posted for this job to process. Click the Select Date icon to define a specific date, or select a relative date from the drop-down.
  • Transaction To Date
    Defines the to date to which transactions are posted for this job to process. Click the Select Date icon to define a specific date, or select a relative date from the drop-down.
    If you do not enter the To Date value, this job posts transactions up to the current date.
Purge Documents Job
This job permanently deletes documents.
Requirements
As an administrator, back up all documents that are stored in the product or in the Knowledge Store.
Restrictions
This job cannot run concurrently with other instances of itself.
Parameters
  • Purge All Documents for the Following Objects
  • [Or] Purge Documents and Versions Not Accessed for [n] Days
  • [Or] Retain the [n] Most Recent Versions and Purge the Prior Versions
  • All Projects
  • Project OBS
  • Specific Project
  • All Resources
  • Resource OBS
  • Specific Resource
  • All Companies
  • Company OBS
  • Specific Company
  • Knowledge Store
Purge Audit Trail Job
This job removes all audit trail records according to specified job settings. You can set this job to run immediately or according to a set date and time, and you can run it on a recurring schedule.
Parameters
None
Purge Financial Tables Job
This job permanently deletes all financial transactions for a specified project.
Requirements
  • (Recommended) Back up all financial transactions before running this job.
  • Grant the Financial Maintenance - Financial Management access right.
  • Verify that the project has a status of "Closed" for the transactions you want to purge.
For more information, see Project Management.
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
  • Investment
    Defines the name of the investment on which this job runs.
Purge Temporary Aggregated Data Job
This job cleans up the data that is created as a part of computing aggregated costs for portfolio management. This includes caching the values of aggregated costs and role demand for the investment hierarchy.
The job is scheduled to run once a day automatically, but can be run on demand too. Run this job for multiple changes in the investment hierarchy, or in the properties of multiple investments.
Requirements
None
Restrictions
This job cannot run concurrently with other instances of this job.
Parameters
  • Purging Option
    Indicates whether to clear all temporary data or only outdated data. Clearing all temporary data cleans up all the temporary as well as cached data. Clearing outdated temporary data preserves the cached data that is still valid and deletes other temporary data.
Rate Matrix Extraction Job
This job extracts rate matrix information and populates the rate matrix extraction tables. Run this job each time the rate matrix has changed or when the financial properties of a project have changed. There is no need to run the job when an investment is created or a resource is added to an investment.
You can prepare the rate matrix data and update the rate matrix data as separate steps. While the prepare step is running, you can continue to use the existing rate matrix data.
Best Practice: Preparing the rate matrix data takes more time than updating the rate matrix data. To minimize the time period during which the rate matrix data is unavailable, schedule two instances of the job, each with only one of the two parameters selected.
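The prepare and update steps behave like a stage-and-swap pattern. The following Python sketch is a schematic of that pattern only; the in-memory staging structure is hypothetical, and NBI_PROJ_RES_RATES_AND_COSTS is the target table named in the parameter descriptions below.

    class RateMatrixStore:
        # Schematic stage-and-swap: prepare fills a staging copy while the live
        # data stays readable; update swaps the staged data in, which is the
        # brief window when the rate matrix data is unavailable.

        def __init__(self):
            self.live = {}       # stands in for NBI_PROJ_RES_RATES_AND_COSTS
            self.staging = None

        def prepare(self, fresh_rates):
            self.staging = dict(fresh_rates)   # live data remains available

        def update(self):
            if self.staging is not None:
                self.live = self.staging       # short unavailability window
                self.staging = None

    store = RateMatrixStore()
    store.prepare({("Project A", "Developer"): 120.0})
    store.update()
    print(store.live)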
The Rate Matrix Extraction functionality provides rates to the following CA PPM actions, without a need to run the job before performing the action. This is applicable provided an investment exists and resources are already added to the investment. If an investment with resources does not exist, run the job to extract the latest rates.
  • Posting Timesheets
  • Baselining a Project
  • Baselining a Task
  • Updating Earned Value Totals using the user interface button or scheduling the job
  • Updating Earned Value History
  • Opening a project in Open Workbench or Microsoft Project
We recommend scheduling the Rate Matrix Extraction job using the "Incremental Update Only" option at intervals that reduce the need for CA PPM to extract the rates in real time for the actions. If the job populates the rates at regular intervals, the performance of the actions improves based on your system data.
Requirements
Set up a rate matrix.
Restrictions
This job cannot run concurrently with other instances of this job. Delete all scheduled instances of this job and reschedule the job using the Prepare Rate Matrix Data parameter or the Update Rate Matrix Data parameter.
Parameters
  • Extract Cost and Rate Information for the Scheduler
    Specifies extracting cost and rate information for a desktop scheduler. This field is a flag that triggers the job to generate resource rates for an investment that include rates prior to the start and after the finish date of the investment.
    Default:
    Cleared
  • Prepare Rate Matrix Data
    Specifies adding the updated rate matrix data to a temporary table. The data in the NBI_PROJ_RES_RATES_AND_COSTS table stays intact. The rate matrix data is available.
    Default:
    Cleared
  • Update Rate Matrix Data
    Specifies copying the updated rate matrix data from the temporary table to the NBI_PROJ_RES_RATES_AND_COSTS table. The rate matrix data is not available.
    Default:
    Cleared
  • Incremental Update Only
    Specifies rates extraction only for currently updated projects. Running the job takes less time compared with a full run.
    Default:
    Selected
Remote Project Sync Job
The job synchronizes projects and teams from CA PPM to epics in VersionOne. The integration allows project managers and business users to manage projects in CA PPM while using the agile planning features of VersionOne. They can plan and track all effort, resources, and time for agile projects using a single system.
Requirements
The VersionOne Connector must be installed.
Restrictions
None.
Parameters
  • Remote API Implementation
    Defines the remote system with which you are exchanging data. For example, select VersionOne to exchange data with VersionOne.
Remote Timesheet Sync Job
The job synchronizes worklogs to epics in VersionOne and creates timesheets in CA PPM. Resources can then review and submit the timesheets from CA PPM. Integrating agile CA PPM projects with VersionOne allows you to track effort in VersionOne for all workitems and view them in CA PPM timesheets.
Requirements
The VersionOne Connector must be installed.
Restrictions
None.
Parameters
  • Remote API Implementation
    Defines the remote system with which you are exchanging data. For example, select VersionOne to exchange data with VersionOne.
Remove Job Logs and Report Library Entries Job
This job removes job log and Report Library entries that are older than a specified number of days from the database.
Requirements
None
Restrictions
You cannot run this job concurrently with any other instance of the Remove Job logs and Report Library Entries job.
Parameters
  • Report age for delete
  • Job age for delete
Setup and Update Data Used by Reports Job
This job extracts data from the transactional database tables and stores them in easily understood reporting tables. These reporting tables are the foundation for most reports that are delivered with CA PPM and are used for any custom reports.
Requirements
(Recommended) Run this job after the Time Slicing job.
Define daily time slice definitions to include a from date at least three months before the current date.
Configure the following settings:
  • Project OBS Mapping
  • Resource OBS Mapping
Restrictions
This job cannot run concurrently with the following jobs:
  • Delete Investments
  • Import Financial Actuals
  • Post Timesheets
  • Post Transactions to Financial
  • Recalculate Cost of Capital Fields
  • Time Slicing
Parameters
None
Synchronize Portfolio Investments
This job synchronizes the portfolio planning data with the latest data from the actual investments. The update is based on a sync schedule that the portfolio manager defines in the portfolio properties. Whenever the job runs based on the sync schedule, the latest data from the actual investments is reflected in the portfolio.
The job copies only those attributes from the actual investments that were registered to display on your portfolio investment pages and views. For more information about viewing the registered attributes or changing the list of registered attributes, see CA PPM Studio Development.
Requirements
None
Restrictions
None
Parameters
None
Time Slicing Job
This job processes all configured time slices and updates discrete transactional data for actual task assignment values, Estimate To Complete (ETC) and baselines, timesheet actuals, team and assignment data from a scenario, resource allocations to projects, and resource availability.
Requirements
None
Restrictions
This job cannot run concurrently with other Time Slicing jobs. Other instances of this job cannot exist.
Parameters
None
Tomcat Access Log Import/Analyze Job
This job imports and analyzes Tomcat access log files from the local CA PPM environment (all app services), then stores and summarizes the data in designated tables (LOG_DETAILS, LOG_SUMMARY, LOG_FILES, LOG_REPORTDEFS). With the addition of custom portlets and queries or externally available content, this analysis data can provide details regarding the performance of the CA PPM system. If you are not running a Tomcat application server, the job runs but does not import any data.
Requirements
None
Restrictions
None
Parameters
  • Log Date
    Specifies the date for the access logs that are imported and analyzed. If no date is specified, the default date is yesterday.
Update % Complete Job
The Update % Complete job updates the percent (%) complete values whenever you change project or task data that affects the percent complete calculation. This job is only run if the % Complete Calculation Method is set to Duration or Effort. This is a field on the project scheduling properties page. Schedule this job to update the percent complete values automatically.
In addition, the following operations trigger this job to run:
  • Publishing the tentative schedule. For more information, see Project Management.
  • Posting actuals and distributing them to the project plan. For more information, see Project Management.
This job is scheduled to run automatically every 30 minutes, but you can also run this job on demand.
Best Practice:
Schedule this job to run recurrently at an appropriate interval, for example, every 10 minutes.
Requirements
None
Restrictions
None
Parameters
None
Update Aggregated Data Job
Use the Update Aggregated Data job to flatten the percentage allocations between investments. Run this job for multiple changes in investment data.
This job is scheduled to run automatically every 10 minutes, but you can also run this job on demand. Do not decrease the frequency of this job, as many objects depend on this job to get the flattened view of percentage allocation between the investments.
Run this job successfully before running the Generate Invoices job if you use chargebacks. Run this job against the current planned cost data if you use Portfolio Management. This job is required to show the planned cost and budget data for all portfolios and included investments.
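Flattening can be read as multiplying the allocation percentages along each path in the investment hierarchy; that reading is an assumption for this sketch, and the data structure is hypothetical.

    def flatten_allocations(direct_allocations, root):
        # Compute each descendant's effective allocation percentage to the root
        # by multiplying the direct percentages along the path.
        flattened = {}

        def walk(parent, factor):
            for child, pct in direct_allocations.get(parent, {}).items():
                effective = factor * pct / 100.0
                flattened[child] = flattened.get(child, 0.0) + effective * 100.0
                walk(child, effective)

        walk(root, 1.0)
        return flattened

    # A program allocates 50% to Project A; Project A allocates 40% to Sub B.
    tree = {"Program": {"Project A": 50.0}, "Project A": {"Sub B": 40.0}}
    print(flatten_allocations(tree, "Program"))
    # Project A's effective allocation is 50 percent; Sub B's is 20 percent.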
To improve the performance of this job and to avoid database contention, make the following jobs incompatible:
  • Datamart Rollup - Time Facts and Time Summary Job
  • Datamart Extraction Job
  • Rate Matrix Extraction Job
  • Oracle Table Analyze Job
  • Time Slicing Job
Requirements
None
Restrictions
This job cannot run concurrently with any other instance of itself nor the Generate Invoices job.
Parameters
None
Update Allocation from Estimates Job
The Update Allocation from Estimates job updates team allocations to match the remaining ETC, starting from the team member ActThrough date.
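A minimal Python sketch of this adjustment follows. The per-period keying and the replace-after-the-actuals-through-date behavior are assumptions; only the intent (allocation follows remaining ETC from the ActThrough date onward) comes from this description.

    from datetime import date

    def update_allocation_from_estimates(allocation_by_period, etc_by_period, act_through):
        # Periods after the team member's actuals-through date take the remaining
        # ETC as their allocation; earlier periods keep their existing allocation.
        updated = dict(allocation_by_period)
        for period_start, etc in etc_by_period.items():
            if period_start > act_through:
                updated[period_start] = etc
        return updated

    allocation = {date(2024, 1, 1): 80.0, date(2024, 2, 1): 80.0}
    etc = {date(2024, 1, 1): 60.0, date(2024, 2, 1): 40.0}
    print(update_allocation_from_estimates(allocation, etc, date(2024, 1, 31)))
    # January keeps its 80.0 allocation; February is aligned to the 40.0 ETC.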
Best Practice: This job processes team and task data, so run it during off-peak hours. Scheduling this job during normal work hours can affect the overall system performance.
Restrictions
This job is available only for projects. The job updates the allocation values only for active projects and ignores all inactive projects.
Parameters
  • Project Name
    Indicates the active project that the user has access rights to edit. The job processes the selected project.
  • Manager
    Indicates the project or investment manager whose resources the user has access rights to view. The job processes all active projects that are assigned to the selected manager.
  • Investment OBS
    Indicates the investment OBS unit that the user has access rights to view. The job processes all active projects that are associated with the selected OBS unit, based on the OBS Mode setting.
  • OBS Mode
    Determines which branches of the selected OBS structure are processed when selecting investments using an OBS unit.
    Values:
    Unit Only, Unit and Ancestors, Unit and Descendants
    Default:
    Unit Only
Error Handling
The user that schedules these jobs may not have the appropriate security rights to all the resources on the investments.
As a best practice when handling resource staffing decisions, always have the appropriate manager who has the appropriate access rights make allocation and ETC decisions for their resources.
If the scheduler of the job does not have the appropriate access rights, the following actions occur:
  • The job proceeds through all resources on the designated investments.
  • The job logs the error in the jobs log file and includes the following information:
    • Investment Name
    • Team Member Name
    • Error Message
    The log file, App-niku-xbl.log, is located at <Clarity_Runtime>\logs.
Update Business Objects Report Tables Job
The Update Business Objects Report Tables job is required for reports that display any of the following: monthly or weekly calendar periods; FTE amounts; and OBS level or phase grouping options. This job populates reporting tables that are based on the parameters that are selected when running the job. If these tables are not populated, reports dependent upon them display a 'No Data Found' message. This job should be scheduled to run nightly to keep the reporting tables up to date.
When running this job, if you receive an error message indicating that "Entity has not been setup in Data Warehouse Settings", then complete the following steps to resolve the issue:
  1. From Administration, Finance, Setup, verify that an entity is set up.
  2. If no entities exist, add an entity and at least 1 monthly fiscal period that covers the range of dates you need for daily time slices.
  3. From Administration, General Settings, System Options, add an Entity for Fiscal Periods in the Data Warehouse Options section.
For details about creating an entity, see Set Up a Financial Entity.
Requirements
None
Parameters
The following are the available parameters for this job, each one populating a different table:
  • Update Reporting Calendar
    This parameter populates the calendar table (rpt_calendar) that stores date ranges for daily, weekly, monthly, and quarterly calendar periods, as well as the FTE for the date range. The start day of the weekly periods is determined by the First Day of Work Week field set in CA PPM (Administration, Project Management, Settings). When this job option is run, it populates the table five years back and five years forward, based on the current date. For example, if you run the job in October of 2012, the table is populated with periods from October 2007 through October 2017. Most of the reports displaying data by calendar period reference this table and display a 'No Data Found' message if this table is not populated. This job option should be run at least once a month. It should also be run if the availability of the resource with Resource ID of 'admin' changes, because this resource's calendar determines the FTE calculation. A small sketch of this period generation follows this list.
  • Update Investment Hierarchy
    This parameter populates the investment hierarchy table (rpt_inv_hierarchy) that stores up to ten levels of investment hierarchical relationships and hierarchy allocation percentages. This investment hierarchy table is being used by reports in the solution pack. This parameter also populates the program hierarchy table (rpt_program_hierarchy) that stores up to five levels of program and project hierarchical relationships. This program hierarchy table is not being used by any reports in the solution pack.
  • Update WBS Index
    This parameter populates the WBS index table (rpt_wbsindex) that stores relationships between phases and tasks. This job option does an incremental update so it can be scheduled to run frequently (e.g., once an hour) if necessary.
  • Update Resource Skills Index
    This parameter populates the resource skills tables (rpt_res_skills_index and rpt_res_skills_flat) that store relationships between resource skills and their parent skills. The job supports up to ten levels in the skills hierarchy.
  • Update OBS
    This parameter populates the data mart OBS table (nbi_dim_obs) that stores OBS unit information up to ten levels. This table is used in some reports for grouping by OBS level. This job option should be running on your system nightly, or if there are changes to the OBS structure. The Datamart Extraction job also populates this table.
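The following Python sketch illustrates the weekly period generation described for the Update Reporting Calendar option: roughly five years back and five years forward from the run date, with weeks starting on the configured first day of the work week. It is a simplified illustration (no FTE calculation, no daily, monthly, or quarterly periods), and the function name is hypothetical.

    from datetime import date, timedelta

    def weekly_periods(run_date, first_day_of_week=0, years_back=5, years_forward=5):
        # Generate (start, end) date ranges for weekly calendar periods.
        # first_day_of_week uses Python's convention (0 = Monday).
        window_start = run_date - timedelta(days=365 * years_back)
        window_end = run_date + timedelta(days=365 * years_forward)
        # Move back to the configured first day of the work week.
        start = window_start - timedelta(days=(window_start.weekday() - first_day_of_week) % 7)
        periods = []
        while start <= window_end:
            periods.append((start, start + timedelta(days=6)))
            start += timedelta(days=7)
        return periods

    periods = weekly_periods(date(2012, 10, 1))
    print(len(periods), periods[0], periods[-1])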
Update Earned Value History Job
The Update Earned Value History job calculates earned value for a project or set of projects and creates earned value snapshots of the time sliced data. This data is based on the earned value reporting period that is assigned to the project and the parameters that you select. The earned value snapshot is used for historical earned value analysis (EVA) and reporting. The snapshots are stored in rows in the PRJ_EV_HISTORY (earned value history) table. You can use this reporting data to write reports.
This job invokes the Update % Complete job before it runs. This job runs on a recurring schedule that is based on how often your organization reports on your earned value data. This job uses the lag value to determine the day to take the snapshot. A snapshot is taken when the job runs on or after the first day following the defined lag.
Example: Monthly with Three Day Lag
If you schedule this job to run monthly starting 2/1/11 with a lag of three days and you have associated the project to an earned value reporting period whose period type is defined as Monthly and frequency is the first day of the month, a snapshot for January 2011 is generated only when the job runs on 2/04/11 or later.
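The lag calculation in the example can be expressed as a simple date addition. The following Python sketch is illustrative only; the period_boundary naming is an assumption.

    from datetime import date, timedelta

    def earliest_snapshot_run(period_boundary, lag_days):
        # First date on which a job run takes the snapshot for the reporting
        # period that closes at period_boundary, given the configured lag.
        return period_boundary + timedelta(days=lag_days)

    # January 2011 closes at the 2/1/11 period boundary; with a three-day lag,
    # a run on 2/4/11 or later generates the January snapshot.
    print(earliest_snapshot_run(date(2011, 2, 1), 3))  # 2011-02-04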
For each project that meets the job parameter criteria, this job:
  • Finds the project's associated earned value reporting period and saves the earned value data for the project tasks based on that period.
  • Locks the project Earned Value Reporting Period field.
For more information, see Project Management.
Requirements
To create a historical snapshot, the project must:
  • Be associated to an earned value reporting period.
  • Have a current baseline.
Restrictions
This job cannot run concurrently with any other instance of the Update Earned Value History job.
Parameters
The following parameters are provided:
If you do not complete any of the parameters, all of the projects are processed.
  • Investment
    Defines the name of the investment on which this job runs.
  • OBS Unit
    Defines the name of the OBS Unit for the project on which this job runs.
  • Investment Manager
    Specifies the name of the resource managing the project.
  • Lag
    Determines the number of days to wait before taking the snapshot. Use this setting to defer taking a historical snapshot so that your organization can reconcile the actual values from one system to another.
  • Rewrite Existing Snapshot
    Indicates for the job to regenerate the current reporting period snapshot and replace the existing current periodic snapshot with updated data. When cleared, the projects having periodic snapshots are ignored.
    Default:
    Cleared
  • Show Projected ACWP
    Indicates for the job to create data for the projected actual cost of work performed (ACWP) of all level-1 tasks in the work breakdown structure (WBS) for a project.
    Default:
    Cleared (disabled)
  • Show Projected BCWP
    Indicates whether you want this job to create data for the projected budgeted cost of work performed (BCWP) of all level-1 tasks in the work breakdown structure (WBS) for a project.
    Default:
    Cleared (disabled)
  • Show Projected BCWS
    Indicates for the job to create data for the projected budgeted cost of work scheduled (BCWS) as of the date for projects and project tasks.
    Default:
    Cleared (disabled)
Update Earned Value and Cost Totals Job
This job calculates the earned value and costs for projects and calculates the costs for all investment types. You can select the project or the investment using the Investment browse field on the Job Properties page.
The Update Earned Value and Cost Totals job tracks investment progress by calculating earned value and updating costs. This job invokes the Update % Complete job before it runs. This job calculates and records the current earned value totals through the current date for one or more investments. The data is stored in a reserved row in the PRJ_EV_HISTORY (earned value history) table. The saved current earned value data totals appear in fields on investments and tasks.
This job is scheduled to run regularly. You can schedule this job to run in the background. You can invoke this job on demand for a project by selecting Update Cost Totals from the Actions drop-down menu. For more information, see 
Project Management
.
Restrictions
This job cannot run concurrently with any other instance of the Update Earned Value and Cost Totals job.
Parameters
The following parameters are provided:
  • Investment
    Defines the name of the investment on which this job runs.
  • OBS Unit
    Defines the name of the OBS Unit for the project on which this job runs.
  • Investment Manager
    Specifies the name of the resource managing the investment.
If you do not enter your own parameter values, this job processes all investments, including NPIOs. A large number of team members on an investment can impact job performance. We recommend that you schedule the job using parameters to selectively update smaller sets of investments.
Update Estimates from Allocations Job
This job updates the Effort Task ETC to match the allocation, starting from the team member ActThrough date. This job can be run for both NPIOs and projects.
Best Practice: This job processes team and task data, so run it during off-peak hours. Scheduling this job during normal work hours can affect the overall system performance.
Restrictions
Projects without an Effort Task are not processed.
Parameters
  • Investment
    Indicates the active investment that the user has access rights to edit. The job processes the selected investment.
  • Manager
    Indicates the project or investment manager whose resources the user has access rights to view. The job processes all active investments that are assigned to the selected manager.
  • Investment OBS
    Indicates the investment OBS unit that the user has access rights to view. The job processes all active investments that are associated with the selected OBS unit, based on the OBS Mode setting.
  • OBS Mode
    Determines which branches of the selected OBS structure are processed when selecting investments using an OBS unit.
    Values:
    Unit Only, Unit and Ancestors, Unit and Descendants
    Default:
    Unit Only
  • Investment Type
    Indicates the investment type: Applications, Assets, Ideas, Other Work, Portfolios, Programs, Services, Projects, and Products.
The job updates the ETC values only for active investments and ignores all inactive investments.
Error Handling
The user that schedules these jobs may not have the appropriate security rights to all the resources on the investments.
When handling staffing decisions, we recommend that the appropriate manager (with the appropriate access rights) makes the allocation and ETC decisions for their resources.
If the scheduler of the job does not have the appropriate access rights, the following actions occur:
  • The job proceeds through all resources on the designated investments.
  • The job logs the error in the jobs log file and includes the following information:
    • Investment Name
    • Team Member Name
    • Error Message
    The log file, App-niku-xbl.log, is located at <Clarity_Runtime>\logs.
Validate Process Definitions Job
This job checks the integrity of a process: for example, whether a subprocess called by the process is active, or whether a step action condition is valid. This job can be useful when you use the process definition XOG to import many process definitions. Process definitions that are imported into the target system are not validated and are in draft mode. You can then run this job to batch validate and activate process definitions.
Certain processes can become invalidated in various cases, for example, during ODF object deletion, object attribute deletion, or process deactivation. You can schedule this job regularly to validate the process definitions.
Restrictions
None
Requirements
The job validates process definitions for which the logged-in user has the Process Definition - View access right. Optionally, the job activates the process definitions when they validate.
Parameters
  • Activate Process If Validated
    When enabled and Process Status is Validated, the job automatically activates the process definitions.
    Default:
    Cleared
  • Process Status
    Specifies the status of process definitions that are imported into the target system.
    Values:
    • Errors Encountered
    • Not Validated
    • Re-validation Required
    • Validated