CA PPM Jobs Reference

Clarity PPM includes stock jobs that you can configure, schedule, and run. As an administrator, use the following list of jobs, descriptions, and parameters for running jobs successfully.
Video: Data Extraction (provided by CA Technologies)
Anonymize Personally Identifiable Information (PII)
This job permanently obfuscates attribute values that contain personal data for organizations with GDPR or other privacy policies. After running the job, the following attributes for designated resources are scrambled with serialized data:
  • Resource Full Name
  • Resource First Name
  • Resource Last Name
  • Resource E-mail
Custom date, string, number, and money attributes are also scrambled.
Parameters
  • Anonymize Only Inactive Resources with Anonymize PII Selected: Select this option to scramble the data for the resources flagged with the Anonymize Personally Identifiable Information (PII) option on the Resource Settings page.
  • WARNING! Anonymize ALL Resources: Select this option to permanently obfuscate the resource data indicated in the job description for all internal resources.
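The attribute list above can be illustrated with a minimal sketch of what serialized obfuscation might look like. This is a hypothetical illustration, not the product's code; the field names and the placeholder format are assumptions.

```python
# Hypothetical sketch of serialized PII obfuscation; not the actual job
# code. Each designated resource's name and e-mail attributes are
# replaced with serialized placeholder values, so records stay unique
# but no longer identify a person. Field names here are assumptions.

def anonymize_resources(resources):
    """Replace PII attributes with serialized placeholder data."""
    for serial, res in enumerate(resources, start=1):
        res["first_name"] = f"First{serial}"
        res["last_name"] = f"Last{serial}"
        res["full_name"] = f"First{serial} Last{serial}"
        res["email"] = f"user{serial}@anonymized.example"
    return resources

resources = [
    {"full_name": "Jane Doe", "first_name": "Jane",
     "last_name": "Doe", "email": "jane.doe@example.com"},
]
anonymize_resources(resources)
print(resources[0]["email"])  # user1@anonymized.example
```

As in the job itself, the replacement is one-way: the original values are not retained anywhere the sketch can recover them from.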
Autoschedule Project Job
This job creates or overwrites tentative schedules by automatically scheduling tasks that are based on task dependencies, constraints, priority order, and related date and resource logic. This job can be run concurrently with other instances of this job.
Parameters
  • Project: Defines the name of the investment to schedule.
  • OBS Unit: Defines the specific OBS unit to schedule. All projects that are contained in the OBS unit and for which you have the Project - View Management right are auto scheduled. If both a project and an OBS unit are selected, the project is ignored and the OBS unit is used.
  • Autoschedule Date: Defines the date from which to schedule tasks.
  • Ignore Tasks Starting Before: Defines the date before which tasks are ignored during auto scheduling.
  • Ignore Tasks Starting After: Defines the date after which tasks are excluded from the schedule.
  • Resource Constraints: Indicates if this job considers resource availability during auto scheduling. Default: Selected
  • Schedule from Finish Date: Indicates that the job schedules backward from the finish date to the start date, rather than from the start date to the finish date. Default: Cleared
  • Subnets: Indicates that the critical path is calculated for the entire project during auto scheduling. Default: Cleared
  • Honor Constraints on Started Tasks: Indicates that the job schedules the remaining work on started tasks according to its normal auto schedule logic, including any task constraints and dependencies. Default: Cleared
  • Schedule Assignments on Excluded Tasks: Indicates that the job can change task assignment dates, as long as the new dates stay within the existing start and finish dates for the task. Default: Cleared
  • Start Successors on Next Day: Indicates that all successor tasks start on the next day. Default: Cleared
  • Publish After Scheduling: Indicates if changes made to the tentative plan are automatically published to the Plan of Record at the end of auto scheduling. When selected, the tentative plan is deleted and the project is unlocked. Default: Cleared
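The two "Ignore Tasks Starting" parameters together define a date window of tasks the job will consider. A minimal sketch of that filter, under the assumption that each parameter is optional and applies to the task start date:

```python
# Hypothetical sketch of the date-window filter implied by the
# "Ignore Tasks Starting Before" and "Ignore Tasks Starting After"
# parameters; not the actual scheduler code.
from datetime import date

def tasks_to_schedule(tasks, ignore_before=None, ignore_after=None):
    """Keep only tasks whose start date falls inside the window."""
    kept = []
    for task in tasks:
        start = task["start"]
        if ignore_before and start < ignore_before:
            continue  # starts before the window: ignored
        if ignore_after and start > ignore_after:
            continue  # starts after the window: excluded
        kept.append(task)
    return kept

tasks = [
    {"name": "Design", "start": date(2024, 1, 10)},
    {"name": "Build",  "start": date(2024, 3, 5)},
    {"name": "Deploy", "start": date(2024, 6, 1)},
]
window = tasks_to_schedule(tasks,
                           ignore_before=date(2024, 2, 1),
                           ignore_after=date(2024, 5, 1))
print([t["name"] for t in window])  # ['Build']
```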
Clean User Session Job
This job removes the session-based user data stored in the product for logged-in resources. Data is removed when its creation date/time, plus the session expiration time, falls before the date/time this job runs. User data contains references to the resource logged in to the product and any session-based data, such as Shopping Carts and Search Results, that has persisted.
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with other jobs including other instances of this job.
 
Parameters
 
None
Convert Mixed Booking Status Job
This job converts all unlocked investment team allocations with mixed booking status to hard or soft. This job also disables the Allow Mixed Booking option. Updates are made for both resource and role allocations for active and inactive investments.
Note: To view the Allow Mixed Booking check box, click Administration, Project Management, Settings.
Use this job if you have activated the New User Experience and are using the new Staffing (Resource Management) features. The new features do not support mixed booking allocations. The job is inactive by default. As an administrator, activate and run this job so users can use the new features.
The job log provides information about the total records that are updated and the total remaining unprocessed records.
The mixed booking allocations are left unprocessed in the following cases:
  • A scheduler has locked the mixed booking allocations for a project.
  • The user selects the target booking status as Mixed instead of Soft or Hard.
Because mixed allocations remain in these cases, the Allow Mixed Booking option is also not disabled.
Requirements
None
Restrictions
None
Parameters
  • Target booking status: Specifies the booking status to which to convert the current mixed booking allocations. The following options are available:
    • Soft. Any planned allocation on a project remains unchanged and the hard allocation is removed. For example, if the planned allocation is 65 percent and the hard allocation is 100 percent, after the mixed allocation status is converted to Soft, the planned allocation remains at 65 percent and the hard allocation is removed. We recommend that you select this option.
    • Hard. Any planned allocation on a project remains unchanged and the hard allocation is updated to match the planned allocation. For example, if the planned allocation is 75 percent and the hard allocation is 100 percent, after the mixed allocation status is converted to Hard, both the planned and hard allocation become the same at 75 percent.
    • Mixed. The mixed booking allocations are left unprocessed. We recommend that you do not select this option.
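The Soft/Hard conversion rules above can be sketched in a few lines. This is a hypothetical illustration of the rules as stated, not the job's implementation; the allocation record shape is an assumption.

```python
# Hypothetical sketch of the Soft/Hard conversion rules; not the actual
# job code. Allocations are dicts with planned and hard percentages
# plus a booking status (record shape is an assumption).

def convert_mixed(allocation, target):
    """Convert a mixed-booking allocation to Soft or Hard."""
    if allocation["status"] != "Mixed" or target == "Mixed":
        return allocation  # a Mixed target leaves records unprocessed
    if target == "Soft":
        allocation["hard_pct"] = None  # planned stays, hard is removed
    elif target == "Hard":
        allocation["hard_pct"] = allocation["planned_pct"]  # hard matches planned
    allocation["status"] = target
    return allocation

alloc = {"status": "Mixed", "planned_pct": 65, "hard_pct": 100}
convert_mixed(alloc, "Soft")
print(alloc)  # {'status': 'Soft', 'planned_pct': 65, 'hard_pct': None}
```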
Copy Cost Plan of Record Charge Code with Cost Type Job
This job creates a copy of a plan of record and adds Cost Type to the existing grouping attributes. This job is intended to process only investments that were created before Release 13.2 or that have been enabled for capitalization.
This job completes the following tasks:
  • The job processes only plans of record that have Charge Code as one of the grouping attributes. The job copies the current plan of record and adds Cost Type to an existing set of grouping attributes. The newly added Cost Type grouping attribute is set to either Capital or Operating on every line. How the attribute is set depends on the charge codes you designate as Capital or Operating in the job parameters. If the charge code for a line is selected in the Capital Charge Code mapping, the cost type of the line is set to Capital. Otherwise, in all cases (including a null charge code), the cost type for the line is set to Operating.
  • The job creates a cost plan with the following name and ID:
    • The new cost plan is named System Converted Cost Plan.
    • The new cost plan ID is created using the following format: COST_PLAN_CONV_<Year-Month-Day (####-##-##)>. For example: COST_PLAN_CONV_2013-08-15.
  • If the check box named Set New Cost Plan as Plan of Record is selected, the new cost plan becomes the plan of record.
  • When the Copy Latest Approved Budget Plan check box is selected, the following rule applies: The current approved budget plan is copied only if it has a charge code selected as a grouping attribute. The newly created budget plan is marked as the approved and current budget plan. If there is no approved budget (only a submitted budget plan), the job does not copy the submitted budget. If there are multiple approved budget plans and a submitted budget, the job copies the current approved budget plan and marks the newly created plan as the current approved budget plan.
    The rules to set the value of the newly added Cost Type grouping attribute are the same as explained previously for the newly added cost plan.
    The job creates a budget plan with the following name and ID:
    • The new budget plan is named System Converted Budget Plan.
    • The budget plan ID is created using the following format: BUDGET_PLAN_CONV_<Year-Month-Day>.
    The newly copied budget is set as the current budget. The newly copied cost plan is made the plan of record only if the Set New Cost Plan as Plan of Record check box is selected.
  • If no failure occurs during the processing of an investment plan, the job marks the investment as successfully processed. Successfully processed investments are skipped in subsequent executions of the job even if these investments are selected again through an OBS unit. Successfully processed investments stop appearing in the Investment browse on the job input screen. Therefore, it is important to select the correct options before submitting investments to the job.
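Two of the rules the job description spells out, the generated plan ID format and the per-line cost type mapping, can be sketched directly. This is an illustration of the stated rules, not the job's code; the function names are hypothetical.

```python
# Hypothetical sketch of two documented rules: the generated cost plan
# ID format and the per-line cost type mapping. Not the actual job code.
from datetime import date

def converted_plan_id(run_date):
    """Build the COST_PLAN_CONV_<Year-Month-Day> ID."""
    return f"COST_PLAN_CONV_{run_date:%Y-%m-%d}"

def line_cost_type(charge_code, capital_codes):
    """Capital if the line's charge code is in the Capital mapping;
    otherwise (including a null charge code) Operating."""
    return "Capital" if charge_code in capital_codes else "Operating"

print(converted_plan_id(date(2013, 8, 15)))            # COST_PLAN_CONV_2013-08-15
print(line_cost_type("Engineering", {"Engineering"}))  # Capital
print(line_cost_type(None, {"Engineering"}))           # Operating
```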
This job updates financial plans for investments so that capital cost appears. When you perform an upgrade, the system builds an initial list of investments registered for the Enable Capitalization job. To enable charge codes from new investments for capitalization, run the Register New Investments for Enable Capitalization job. You can run the Copy Cost Plan of Record Charge Code with Cost Type job as an ongoing function for bulk updates. You can deactivate the job after you have enabled investments for capitalization and have copied their plans of record.
 The statistics of a job execution are printed in a BG log file for the job. You can open and read the log file. The log file contains information such as the number of processed, skipped, or failed investments.
 
Requirements
 
The Enable Capitalization job must be run on the investment before you can select the investment for this job. You can also run the Register New Investments for Enable Capitalization job.
 
Restrictions
 
If your investment cost plans contain a large amount of detail, process a few investments at a time.
 
Parameters
 
  • OBS Unit: Specifies the OBS whose investments you want to process. The job processes all investments for the OBS and its descendants. Use this option if you do not want to select individual investments for processing.
  • Investment: Specifies the individual investments that you want to process.
  • Capital Charge Code: Specifies the charge codes that you want to designate as the cost type Capital. If you do not indicate a value for a specific charge code, the default cost type is applied.
  • Operating Charge Code: Specifies the charge codes that you want to designate as operating cost. If you do not indicate a value for a specific charge code, the default cost type is applied.
Create and Update Jaspersoft Users Job
To improve performance, the Create and Update Jaspersoft Users job does not create user-specific folders under the Users folder in Jaspersoft. At one point it did; however, this behavior ended in 14.2.0.8, 14.3.0.6, 14.4, and all higher releases.
For example, as an administrator, you create a resource in CA PPM and provide the resource with Advanced Reporting access rights. The new resource does not automatically get a folder under the Users folder with the folder name matching the CA PPM resource ID. However, the Jaspersoft administrator can create user-specific folders under the Users folder in Jaspersoft. The PPM administrator must always provide administrative rights to those resources for their user-specific folders so they can manage their folders. Ideally, only the specific Advanced Reporting user and the Jaspersoft administrator need access to the user-specific folders.
After a PPM upgrade, the content of the user-specific folders remains intact. However, after an upgrade, this job deletes any empty folders. The Full Sync option deletes any empty user-specific folders under the Users folder, including any folders that might have been created by earlier releases of CA PPM. The empty user-specific folders are deleted only if the folder names correspond to CA PPM resource IDs. The user-specific folders are not deleted if the Jaspersoft administrator follows any other naming convention.
Run this job after creating or updating Clarity PPM users with Advanced Reporting access rights to synchronize them with Jaspersoft. This job performs the following tasks:
  • Creates Advanced Reporting users if they do not already exist in Jaspersoft.
  • Passes and updates user properties from Clarity PPM to Jaspersoft.
  • Updates license types in Clarity PPM and assigns Jaspersoft roles based on assigned Advanced Reporting access rights.
  • Disables users in Jaspersoft if they are inactive or locked in Clarity PPM.
  • If a Jaspersoft superuser disables or deletes a valid Clarity PPM user, this job makes the user active again in Jaspersoft.
Note: Schedule this job to execute nightly (off-peak hours). Select the Include Inactive and Locked Users check box to synchronize all the Jaspersoft users (active, inactive, and locked). If this option is not selected, only active CA PPM users are synchronized with corresponding Jaspersoft users. Run this job before you run the Synchronize Jaspersoft Roles job.
Note: The Jaspersoft User ID field is mapped to the User Name field in Clarity PPM. In certain situations, such as when the User Name in Clarity PPM has special characters, or if you change the User Name in Clarity PPM, the User Name value in Clarity PPM and the User ID value in Jaspersoft do not match. For example, a Clarity PPM user name such as user@company.com populates the User ID field in Jaspersoft as user_company_com. In another example, the Clarity PPM User Name changes from userABC to user XYZ. In this example, the User ID field in Jaspersoft is populated as userABC. This is the first User Name value you specified in Clarity PPM. If you search in Jaspersoft for the Clarity PPM user (user XYZ), you will not find the user. To resolve this situation, go back and change the User Name in Clarity PPM to the original name. If this is not possible, search in Jaspersoft on another field, such as the User name. When you do this, you can find the Clarity PPM user in Jaspersoft.
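The special-character example above suggests that characters outside the alphanumeric range are replaced with underscores when the User ID is derived. The exact substitution rule is an assumption inferred from the example, so treat this as a sketch rather than the product's mapping:

```python
# Sketch of the User Name -> Jaspersoft User ID mapping suggested by
# the example above. The substitution rule is an assumption: any
# character outside [A-Za-z0-9_] is replaced with an underscore.
import re

def jaspersoft_user_id(ppm_user_name):
    """Derive a Jaspersoft-safe User ID from a Clarity PPM User Name."""
    return re.sub(r"[^A-Za-z0-9_]", "_", ppm_user_name)

print(jaspersoft_user_id("user@company.com"))  # user_company_com
```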
Create Data Warehouse Trend Job
This job creates new snapshots of your trending data.
 
Requirements
 
Run the Load Data Warehouse job before running this job.
 
Parameters
 
 
Trend Type: Defines the type of snapshots to take. The following options are available:
  • All Trend Tables: Snapshots all tables to a trend.
  • Monthly Trend Tables: Snapshots tables with a trend type of Monthly or All.
  • Summary Trend Tables: Snapshots tables with a trend type of Summary or All.
  • Fiscal Trend Tables: Snapshots tables with a trend type of Fiscal or All.
 
Trend Period: Use this field to populate a year of periods or a single period of data for fiscal and monthly trend types. The field allows you to set up a schedule without having to change the trend period every month.

Trend Year: If the Trend Type is set to Monthly Trend Tables, a snapshot of those tables is taken for that year. The year is required for Monthly Trend Tables and defaults to the current year if it is not filled in.

Trend Name: Use this field to name the trend. If filled in, this value is the trend name; otherwise, the trend name defaults to YYYY-MM-DD:[Trend_key]. For example, 2017-04-10:5000001.
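The default naming rule can be sketched in a couple of lines. This is an illustration of the documented format, not the job's code:

```python
# Hypothetical sketch of the default trend-name rule: use the supplied
# name if present, else YYYY-MM-DD:[Trend_key]. Not the actual job code.
from datetime import date

def trend_name(run_date, trend_key, name=None):
    """Return the trend name, applying the documented default."""
    return name if name else f"{run_date:%Y-%m-%d}:{trend_key}"

print(trend_name(date(2017, 4, 10), 5000001))        # 2017-04-10:5000001
print(trend_name(date(2017, 4, 10), 5000001, "Q2"))  # Q2
```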
Datamart Extraction Job
This job extracts data from the transaction database tables and stores it in the following legacy datamart reporting tables. These tables were the foundation for most stock reports and for legacy custom reports.
  • NBI_R_FACTS
  • NBI_RUN_LOGS
  • NBI_ROLLUP_SQL
  • NBI_RESOURCE_TIME_SUMMARY
  • NBI_RESOURCE_CURRENT_FACTS
  • NBI_PRT_FACTS
  • NBI_PRTF_FM
  • NBI_PROJ_RES_RATES_AND_COSTSCP
  • NBI_PROJECT_FORECAST
  • NBI_PROJECT_CURRENT_FACTS
  • NBI_PM_PROJECT_TIME_SUMMARY
  • NBI_FM_PROJECT_TIME_SUMMARY
  • NBI_EVENTS
  • NBI_DIM_OBS_FLAT
  • NBI_DIM_OBS
  • NBI_DIM_CALENDAR_TIME
  • NBI_DIM_FISCAL_TIME
For Microsoft SQL Server with the SQL Server Agent enabled, add the Clarity PPM administrator account to the SQLAgentUserRole role to run datamart extraction jobs. See the Microsoft SQL Enterprise Manager documentation for details about adding user accounts to the SQLAgentUserRole role.
 
Requirements
 
  • (Recommended) Run this job after the Time Slicing job.
  • Define daily time slice definitions with a start date at least three months before the current date.
  • (Optional) Set up datamart stoplights.
  • Configure the following Datamart settings:
    • Datamart Currency
    • Datamart Entity
    • Datamart Extraction Options
    • Project OBS Mapping
    • Resource OBS Mapping
 
Restrictions
 
This job cannot run concurrently with the following jobs:
  • Datamart Extraction (other instances of this job cannot exist)
  • Datamart Rollup - Time Facts and Time Summary
  • Delete Investments
  • Import Financial Actuals
  • Post Timesheets
  • Post Transactions to Financial
  • Recalculate Cost of Capital Fields
  • Time Slicing
Parameters
  • Extract Only OBS: Specifies that the job extracts only the OBS data if this option is selected.
  • Extract Only Calendars: Specifies that the job extracts only the calendar data if this option is selected.
Note:
  • To refresh NBI_DIM_OBS and NBI_DIM_OBS_FLAT, select Extract Only OBS.
  • To refresh NBI_DIM_CALENDAR_TIME and NBI_DIM_FISCAL_TIME, select Extract Only Calendars.
  • To refresh NBI_DIM_OBS, NBI_DIM_OBS_FLAT, NBI_DIM_CALENDAR_TIME, and NBI_DIM_FISCAL_TIME, select both check boxes.
  • To refresh all tables, clear both check boxes.
Note: By default, the parameters are unchecked. The job executes completely and generates all datamart table data. After a full and successful execution of the job, the entity and currency fields on the Datamart Settings page under Administration are locked.
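The check-box combinations above map to table groups deterministically, which can be sketched as a small lookup. This is an illustration of the documented behavior, not the job's code; the placeholder for a full extraction is hypothetical.

```python
# Hypothetical sketch of how the two check boxes select which datamart
# tables are refreshed; not the actual job code.

OBS_TABLES = ["NBI_DIM_OBS", "NBI_DIM_OBS_FLAT"]
CALENDAR_TABLES = ["NBI_DIM_CALENDAR_TIME", "NBI_DIM_FISCAL_TIME"]

def tables_to_refresh(extract_only_obs, extract_only_calendars):
    """Return the table groups refreshed for a check-box combination."""
    if not extract_only_obs and not extract_only_calendars:
        return ["<all datamart tables>"]  # full extraction (placeholder)
    tables = []
    if extract_only_obs:
        tables += OBS_TABLES
    if extract_only_calendars:
        tables += CALENDAR_TABLES
    return tables

print(tables_to_refresh(True, False))   # ['NBI_DIM_OBS', 'NBI_DIM_OBS_FLAT']
print(tables_to_refresh(False, False))  # ['<all datamart tables>']
```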
Datamart Rollup - Time Facts and Time Summary Job
This optional job populates the following time facts and time summary tables for resources who want to develop custom reports:
  • NBI_PM_PT_FACTS
  • NBI_FM_PT_FACTS
  • NBI_RT_FACTS
  • NBI_PM_PROJECT_TIME_SUMMARY
  • NBI_FM_PROJECT_TIME_SUMMARY
  • NBI_RESOURCE_TIME_SUMMARY
The populated data is not used in standard Clarity PPM reports.
 
Requirements
 
  • (Recommended) Run this job after generating detailed facts from the Datamart Extraction job.
  • To generate time facts and time summary data, complete the following steps:
    1. From Administration, select Datamart Settings under Data Administration.
    2. In the Customization of Datamart Extraction section, select at least one of the check boxes.
 
Restrictions
 
Multiple instances of this job cannot run concurrently. This job cannot run with the following jobs:
  • Datamart Rollup - Time Facts and Time Summary
  • Datamart Extraction
  • Delete Investments
 
Parameters
 
None
Delete Investments Job
This job permanently deletes investments (projects, programs, assets, and other work) and their associated data, including investment hierarchy, financial data, tasks, timesheets, documents, and time periods, when the investments are marked for deletion.
Note: Change the project track mode from PPM to Other if the assignments have actuals. Open the project and select Properties, Settings. If you do not change the track mode, the project is not deleted.
 
Requirements
 
You need the <Investment> - Delete access right.
The project must meet the following conditions:
  • The project is inactive.
  • The project is marked for deletion.
  • The project cannot contain time entries.
(Recommended) Back up all projects before running this job.
 
Restrictions
 
This job cannot run concurrently with the following jobs:
  • Delete Investments (other instances of this job cannot exist)
  • Import Financial Actuals
  • Post Timesheets
  • Post Transactions to Financial
  • Time Slicing
 
Parameters
 
None
Delete Data Warehouse Trend Job
This job deletes previous snapshots of your trending data. You can specify the removal of a trend or all snapshots before a specific or relative date.
 
Requirements
 
None.
 
Parameters
 
 
Trend: Select one trend. All its snapshots and trending data are deleted.

Delete Trends Created Prior To: If a specific date or relative date is chosen, all trends whose Trend Start Date is on or before that date are deleted, along with their snapshots.
Delete Log Analysis Data Job
This job removes the Clarity PPM log analysis data. The criterion for removing the data is the LOG_DATE field on each of the log analysis tables.
This job is scheduled automatically to run at 1:00 AM each day.
 
Requirements
 
None
 
Restrictions
 
None
 
Parameters
 
  • Log retention in days: Specifies the number of days that data is retained in the tables that are related to analyzed access logs. The default value for this parameter is 30 days.
  • Session token retention in days: Specifies the number of days that data is retained in the LOG_SESSIONS table. The data stores a mapping of the Clarity PPM session token to CMN_SEC_USERS.ID for analysis and audit purposes. The default value for this parameter is 14 days.
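The retention parameters translate into a simple cutoff on LOG_DATE: rows older than the retention window are removed. A sketch of that rule, as an illustration rather than the job's code:

```python
# Hypothetical sketch of the retention rule implied by the parameters
# above: rows whose LOG_DATE falls before the retention cutoff are
# removed. Not the actual job code.
from datetime import date, timedelta

def retention_cutoff(run_date, retention_days=30):
    """Rows with LOG_DATE before this date are deleted."""
    return run_date - timedelta(days=retention_days)

def rows_to_delete(rows, run_date, retention_days=30):
    cutoff = retention_cutoff(run_date, retention_days)
    return [r for r in rows if r["log_date"] < cutoff]

rows = [{"log_date": date(2024, 1, 1)}, {"log_date": date(2024, 2, 20)}]
print(retention_cutoff(date(2024, 3, 1)))           # 2024-01-31
print(len(rows_to_delete(rows, date(2024, 3, 1))))  # 1
```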
Delete Process Instance Job
This job deletes process instances with a status of Done or Aborted.
 
Requirements
 
Run the Delete Process Instance job before you upgrade. Without performing this step, a series of full scans on the BPM_ERRORS table can slow performance. For every query, the database flushes all buffer data to disk in order to load and read new rows. Run this job with date parameters to improve performance. For example, run in monthly time segments. 
This job removes stale data from the following tables:
CAL_ACTION_ITEM_ASSIGNEES, CAL_ACTION_ITEMS, BPM_RUN_PROCESSES, BPM_RUN_ASSIGNEE_NOTES, BPM_RUN_REPLACE_ASSIGNEES, BPM_RUN_ASSIGNEES, BPM_RUN_STEP_TRANSITIONS, BPM_RUN_STEP_COND_RESULTS, BPM_RUN_STEP_COND_OBJECTS, BPM_RUN_STEP_ACTION_RESULTS, BPM_RUN_OBJECTS, BPM_RUN_STEPS, ESC_RUN_USERS, ESC_RUN_LEVELS, ESC_RUN_UNREG_OBJECTS, BPM_RUN_THREADS, BPM_ERRORS, BPM_ERROR_ARGUMENTS
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
  • Process Name: Defines the name of the process instance to delete.
  • Process Instance Status: Defines the status of the process instances you want to delete. Select a status from the drop-down.
    Values: Aborted, Done
  • Finish Date From: Defines the date from which all completed process instances within the selected date range are deleted. Specify one of the following dates:
    • Specific Date: Enter a date or use the Calendar tool to select the date.
    • Relative Date: Select the appropriate relative date from the drop-down.
  • Finish Date To: Defines the date to which all process instances within the selected date range are deleted. Specify one of the following dates:
    • Specific Date: Enter a date or use the Calendar tool to select the date.
    • Relative Date: Select the appropriate relative date from the drop-down.
      Default: Start of Previous Quarter
  • Object Type: Defines the object type of the process instances you want to delete, such as a project name. Enter the name of the object to process, or use the search tool to select an object.
  • Initiated By: Defines the name of the user who initiated the process instances you want to delete. Enter the resource name or use the search tool to select a resource.
  • Initiator OBS: Enter the OBS name or use the search tool to select the OBS name or OBS unit.
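The requirements above recommend running the job in monthly time segments via the Finish Date From/To parameters. A helper that splits a range into those monthly pairs might look like this; it is a hypothetical planning aid, not part of the product:

```python
# Hypothetical helper that splits a date range into the monthly
# segments suggested for the Finish Date From / Finish Date To
# parameters; not part of the product.
from datetime import date, timedelta

def monthly_segments(start, end):
    """Yield (from_date, to_date) pairs, one per calendar month."""
    segments = []
    current = start
    while current <= end:
        if current.month == 12:
            next_first = date(current.year + 1, 1, 1)
        else:
            next_first = date(current.year, current.month + 1, 1)
        segments.append((current, min(end, next_first - timedelta(days=1))))
        current = next_first
    return segments

for frm, to in monthly_segments(date(2024, 1, 15), date(2024, 3, 10)):
    print(frm, "to", to)
# 2024-01-15 to 2024-01-31
# 2024-02-01 to 2024-02-29
# 2024-03-01 to 2024-03-10
```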
Enable Capitalization Job
This job enables you to set expenses in investments that were created before Release 13.2 as either Capital or Operating. Before Release 13.2, all expenses were operating expenses by default. The job sets the cost type as either Capital or Operating on the following items:
  • Investments
  • Investment tasks
  • Investment transactions (actual posted WIP transactions)
After you run this job, run the 
Copy Cost Plan of Record Charge Code with Cost Type 
job to complete the capitalization feature setup.
During upgrade, the job takes into account the initial list of investments. To enable charge codes for capitalization for new investments post-upgrade, use the Register New Investments for Enable Capitalization job.
 
What the Job Does
 
  • Sets the investment cost type to either Capital or Operating.
    If the charge code of the specified investment is selected in the Capital Charge Codes browse field on the job screen, the cost type of that investment is set to Capital. Otherwise, if the charge code of the specified investment is selected in the Operating Charge Code browse field, or if the charge code is not chosen in either of the browses, the cost type of that investment is set to Operating.
  • Sets the task cost type to Capital or Operating for each task according to charge codes.
    If the charge code of the specified task is selected in the Capital Charge Codes browse field on the job screen, the cost type of that task is set to Capital. Otherwise, if the charge code of the specified task is selected in the Operating Charge Code browse field, or if the charge code is not chosen in either of the browses, the cost type of that task is set to Operating.
  • Sets the transaction cost type for actual posted WIP transactions to Capital or Operating where appropriate.
    If the charge code of a transaction for a selected investment maps to a capital charge code, the cost type of that transaction is set to Capital. If the charge code of a transaction for a selected investment maps to an operating charge code, or if the charge code is not chosen, the cost type of that transaction is set to Operating.
    During an upgrade, all transactions are set to the Operating cost type by default.
  • Sets the team Capitalization % value to the value you enter in the Capitalization Percent parameter.
    This value is used to calculate the percentage of operating cost and capital cost for team allocations.
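The cost-type rules above (and the example table that follows) reduce to a small decision function. This sketch illustrates those rules under the stated behavior that a task with a NULL charge code keeps a NULL cost type while a transaction with a NULL charge code defaults to Operating; it is not the job's code.

```python
# Hypothetical sketch of the cost-type rules listed above; not the
# actual job code. Capital wins when the charge code is in the Capital
# Charge Codes selection; anything else is Operating, except that a
# task with a NULL charge code keeps a NULL cost type while a
# transaction with a NULL charge code defaults to Operating.

def cost_type(charge_code, capital_codes, item="task"):
    """Return the cost type the job would assign to a task or transaction."""
    if charge_code is None:
        return None if item == "task" else "Operating"
    return "Capital" if charge_code in capital_codes else "Operating"

capital = {"eCom", "Engineering", "CRM"}
print(cost_type("Engineering", capital))             # Capital
print(cost_type("Maintenance", capital))             # Operating
print(cost_type(None, capital))                      # None
print(cost_type(None, capital, item="transaction"))  # Operating
```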
 
Example
 
The following table shows how the job processes two investments and the cost types that are assigned to tasks and transactions. In the job parameters, the following selections are specified:

Investments: eCommerce Portal Implementation, CRM Enhancement
Capital Charge Codes: eCom, Engineering, CRM
Operating Charge Codes: Operations, Maintenance
 
Each entry lists the item, its charge code, the cost type assigned, and how the cost type was assigned.

Investment: eCommerce Portal Implementation | Charge Code: eCom | Cost Type Assigned: Capital
Tasks:
  • Development | Engineering | Capital
    The Development task has a charge code of Engineering, which is designated as a Capital charge code; therefore, the task is assigned a Capital cost type.
  • Training | Travel Expense | Operating
    The Travel Expense charge code was not selected in either the Capital or Operating Charge Code parameter field; therefore, the Training task receives the default cost type of Operating.
  • Miscellaneous | Null | Null
    The cost type for the Miscellaneous task is NULL because the charge code is NULL on the task.
Transactions:
  • Software Purchase | Engineering | Capital
    This transaction has a charge code of Engineering, which is a Capital charge code.
  • Hardware Purchase | Null | Operating
    The charge code was not selected in any charge code fields; therefore, the job does not change the value of the cost type.
  • Attended Conference | Travel Expense | Operating
    The charge code was not selected in any charge code fields; therefore, the job does not change the value of the cost type.

Investment: CRM Enhancement | Charge Code: CRM | Cost Type Assigned: Capital
Tasks:
  • Resolve Bugs | Maintenance | Operating
    The Resolve Bugs task has a charge code of Maintenance, which is designated as an Operating charge code; therefore, the task is assigned an Operating cost type.
  • Help Desk | Operations | Operating
    The Help Desk task has a charge code of Operations, which is designated as an Operating charge code; therefore, the task is assigned an Operating cost type.
 The statistics of a job execution are printed in a BG log file for the job. You can open and read the log file. The log file contains information such as the number of investments that were processed, skipped, or failed.
 
Requirements
 
None.
 
Restrictions
 
The length of time the job runs depends on the amount of data that is associated with the investments you select. If you have investments with a great amount of associated data, we recommend limiting the number of investments for a job run. We recommend running the job immediately after the upgrade before any modifications to investments, tasks, or transactions are made.
 
Parameters
 
  • OBS Unit: Specifies the OBS units whose investments you want to process. The job processes all investments for the OBS units and their descendants. Use this option if you do not want to select individual investments for processing.
  • Investment: Specifies the individual investments that you want to enable to display capital and operating expenses.
  • Capital Charge Code: Specifies the charge codes that you want to designate as the cost type Capital for the selected investments (based on the OBS unit or individual investments).
  • Operating Charge Code: Specifies the charge codes that you want to designate as operating cost for the selected investments (based on the OBS unit or individual investments).
  • Capitalization Percent: Specifies the amount of expense for an investment that is designated capital expense. This number is used to calculate operating and capital expense.
Execute a Process Job
This job executes a process that is not associated with any object.
 
Requirements
 
None
 
Restrictions
 
  • The job can execute only the processes to which you have access.
  • The job can execute only non-object-based processes (that is, processes without a primary object).
 
Parameters
 
  • Process ID: The ID of the process to execute.
Generate Invoices
This job takes a set of unprocessed transactions, matches them with best-fitting chargeback rules from the investment hierarchies, and applies the rule to generate chargeback transactions. An invoice header is generated for every unique combination of department and fiscal time period.
If an invoice exists for the department-fiscal time period combination, and if the invoice is locked, the job cannot generate new chargeback transactions against the invoice. Instead new chargeback transactions are created for an invoice in the next, unlocked fiscal time period.
Your finance manager can view or monitor any errors or warnings that are caused by running this job.
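The header-generation and lock rules above can be sketched as a small lookup: one header per unique (department, fiscal period), with transactions bound for a locked invoice rolling forward to the next unlocked period. This is an illustration of the stated rules, not the job's code; the data shapes are assumptions.

```python
# Hypothetical sketch of the invoice-header rule described above: one
# header per unique (department, fiscal period), and transactions
# bound for a locked invoice roll to the next unlocked period. Not the
# actual job code; record shapes are assumptions.

def invoice_key(txn, locked_invoices, period_order):
    """Return the (department, period) header a transaction lands on."""
    dept, period = txn["department"], txn["period"]
    idx = period_order.index(period)
    while (dept, period_order[idx]) in locked_invoices:
        idx += 1  # locked invoice: move to the next unlocked period
    return (dept, period_order[idx])

periods = ["2024-Q1", "2024-Q2", "2024-Q3"]
locked = {("IT", "2024-Q1")}
txn = {"department": "IT", "period": "2024-Q1", "amount": 500}
print(invoice_key(txn, locked, periods))  # ('IT', '2024-Q2')
```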
 
Requirements
 
  • Set up a financial structure, including entity, financial classes, and rate matrices.
  • Define credit and overhead rules.
  • Define investment debit or standard debit rules to process chargebacks.
  • Post the WIP transactions.
See Using: Financial Management for more information.
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
  •  
    Entity
    Defines the entity for which to generate invoices.
  •  
    Regenerate
    Indicates if this job processes all transactions or only new and updated transactions.
    Options:
    • All. Regenerates the chargeback transactions by applying existing rules.
    • New Transactions/Adjustments. Processes only new unprocessed transactions or adjustments.
  •  
    Lock/Submit
    Indicates if all invoices from prior periods are automatically locked and submitted. If an invoice is already locked (for example, a user may be actively reviewing it), automatic submit may not occur.
    Options: 
    Lock/Submit or None
  •  
    Override Manual Locks
    Indicates if this job can temporarily unlock previously generated invoices and regenerate them.
Import Financial Actuals Job
This job updates task assignments with the actuals entered in financial transactions and WIP adjustments. The assignment ETC is decremented through the transaction entry date (as in a Fixed Loading Pattern). ETC in the future is not decremented even if the actual amount is greater than the ETC amount for the period being posted.
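A minimal sketch of the fixed-loading decrement described above. The sortable period keys and the consumption loop are illustrative assumptions:

```python
def apply_actuals_to_etc(etc_by_period, actual_hours, entry_period):
    """Posted actuals consume ETC only in periods up to and including the
    transaction entry period; future-period ETC is left untouched even
    when the actuals exceed the available ETC."""
    remaining = actual_hours
    result = dict(etc_by_period)
    for period in sorted(p for p in result if p <= entry_period):
        consumed = min(result[period], remaining)
        result[period] -= consumed
        remaining -= consumed
    return result
```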
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with the following jobs:
  • Import Financial Actuals (other instances of this job cannot exist)
  • Delete Investments
 
Parameters
 
None
Index Contents and Documents for Searches Job
This job indexes the search content (such as activities and action items) and documents.
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
None
Investment Allocation Job
This job updates the ETCSUM and EACSUM fields for investments and bases its calculations on the resources that are allocated to the investment. The job calculates the sum of the ETC values for all the assignments for the investment and stores the value in the ETCSUM field. The job also calculates the sum of the EAC values for all the investment assignments and stores the value in the EACSUM field. In addition, this job updates the total actuals for an investment.
You can run this job immediately or on a scheduled basis. Do not set a frequent schedule for the Investment Allocation job. Frequent runs let investment lists display updated actuals and ETC at near real-time intervals; however, for better performance, we recommend that you run the job on a daily schedule for reporting, and not more frequently. This job is database intensive because it aggregates data from multiple tables and updates every investment in the system.
All users can navigate to one of the lists that are based on investment type as a common entry point into the properties of all investments. To optimize the performance of the application, limit the number of fields that you add to these investment list pages.
To view actuals and ETC at the project level, use a custom portlet. To achieve better real-time results, limit the data that you include in the portlet layout.
When you run the Investment Allocation job between incremental runs of the Rate Matrix Extraction job, the incremental option behaves in the same manner as a full execution. In this case, the Investment Allocation job updates all projects.
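The aggregation this job performs can be illustrated as follows. This sketch uses in-memory dictionaries and assumed field names; the real job updates database columns:

```python
def rollup_investment(assignments):
    """Sum assignment-level ETC and EAC across an investment and store the
    results as ETCSUM and EACSUM; also roll up total actuals, as the job
    description states. Keys 'etc', 'eac', and 'actuals' are assumptions."""
    etcsum = sum(a["etc"] for a in assignments)
    eacsum = sum(a["eac"] for a in assignments)
    actuals = sum(a["actuals"] for a in assignments)
    return {"ETCSUM": etcsum, "EACSUM": eacsum, "ACTUALS": actuals}
```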
 
Requirements
 
None
 
Restrictions
 
None
 
Parameters
 
None
Load Data Warehouse Job
The Load Data Warehouse Job extracts data from the database, transforms the data to a denormalized format, and loads the data into the data warehouse. The frequency of this job determines the freshness of the data. This job populates dimensions, facts, and lookups in the data warehouse for stock objects and attributes and any custom objects and attributes that have been explicitly enabled for inclusion in the Data Warehouse. This job also updates the Advanced Reporting domains, if you have set up Advanced Reporting with Jaspersoft.
For new installations, this job is active. In upgraded environments, the job state matches its original pre-upgrade state.
Only the most commonly used stock objects and attributes are enabled for inclusion in the data warehouse. Enable all custom objects and attributes before running this job to populate them in the data warehouse.
Note: The data warehouse is not designed to be a real-time data reference. If you need to run reports on live data, we recommend that you create ad hoc views and reports based on your Clarity PPM data sources.
If you have any large string attributes that are enabled for the data warehouse, they cannot exceed 1,000 characters. The data warehouse includes only the first 1,000 characters from large string attributes and truncates the rest.
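The 1,000-character truncation rule can be expressed as a one-line sketch (illustrative only; the constant comes from the statement above):

```python
DWH_LARGE_STRING_LIMIT = 1000  # limit stated in the documentation

def to_dwh_string(value):
    """Keep only the first 1,000 characters of a large string attribute
    and truncate the rest, as the data warehouse does."""
    return value[:DWH_LARGE_STRING_LIMIT]
```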
To populate the data warehouse with data from the 
Clarity PPM
 database, run the following jobs in the order given:
  • Time Slicing
  • Load Data Warehouse
 
Requirements
 
Run this job with the Full Load option selected in the following cases:
  • You perform an upgrade or install a patch on 
    Clarity PPM
    .
  • You add a new language in the data warehouse system options. Running an incremental load of this job does not update the data warehouse with the new language.
  • You delete an attribute or unselect the Include in the Data Warehouse option for the attribute in Studio. You cannot re-enable the attribute for inclusion in the data warehouse until this job has completed at least one run.
  • You change the entity for fiscal periods in the data warehouse options.
  • You change the timeslice dates to include a larger timeframe.
  • You change the First Day of Work Week.
  • You change any of the settings in the System Options, Data Warehouse Options.
Also, run this job each time the data type of an attribute that is included in the data warehouse is changed in Studio. First disable the attribute, then run this job. Next, re-enable the attribute with the correct data type, and rerun this job for the changes to take effect.
Running this job with the Full Load option adds the following financial plans for applicable investments to the data warehouse:
  • The Plan of Record
  • The current budget plan
  • All benefit plans
 
Restrictions
 
While the Load Data Warehouse job runs, any concurrent user edits may temporarily be excluded from the data warehouse.
Note: The job copies records based on their last_updated_date attribute. The job identifies all the object instances modified before the job start date and time. This may sound obvious to experienced database administrators; however, be aware that any record modified while the job runs is not populated in that run. A unique situation arises with investments because they appear in two tables: one for investments and one for projects. If users are making concurrent updates, records in DWH_INV_INVESTMENT may not match records in DWH_INV_PROJECT. The next instance of the full or incremental Load Data Warehouse job refreshes the data and corrects this common temporary condition.
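The incremental selection described in the note can be sketched as a simple predicate on last_updated_date. The datetime keys and the open-interval window are assumptions for illustration:

```python
from datetime import datetime

def incremental_rows(rows, last_run, job_start):
    """Select only records modified after the previous run and before this
    job's start time; anything modified while the job runs is picked up
    by the next full or incremental load."""
    return [r for r in rows if last_run < r["last_updated_date"] < job_start]
```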
 
Parameters
 
  •  
    Full Load
     
Rebuilds the data warehouse from scratch if selected. If the option is not selected, this job looks only for incremental changes since the last time you ran it. We recommend that you run the job with the Full Load option selected during off-peak hours for better performance. At other times, run this job for incremental loads.
If you are running this job for the first time to populate the data warehouse, select the Full Load parameter. Also, if you had a failed run of the Load Data Warehouse job and you corrected the problem, select Full Load the first time that you run this job after the failure. For subsequent runs, you can run an incremental load by leaving the Full Load parameter unselected.
 
Database Size and Job Frequency
 
The size of your database determines how frequently you run this job.
  •  
    Full loads
     recreate the data warehouse from scratch. You can run a full load once a day during non-peak hours. For a global enterprise, you may decide to extend the interval to once a week, or even longer.
  •  
    Incremental loads
     can be run more frequently; the interval depends on the size of your database. The best way to determine the frequency is to run the incremental load and see how long it takes to complete. We recommend that you run incremental loads every 3-4 hours, depending on the size of your database.
 
Custom Jobs and Processes
 
This job imports data into the data warehouse database based on the last_updated_date field on the object's instances. If you have any custom jobs and processes that are updating the last_updated_date field, we strongly advise you to reschedule those jobs and processes to run at a different time when the Load Data Warehouse job is not running. In addition, set the Load Data Warehouse job to be incompatible with your custom jobs and processes (Administration, Reports and Jobs). If you do not do this, data may not be updated in the data warehouse.
 
Server Time, Timezone, Date, and Time
 
To ensure the correct functionality and accuracy of data in 
Clarity PPM
 and all jobs, including the Load Data Warehouse Job, verify the following:
  • The server time is the same (preferably down to the second) on the 
    Clarity PPM
     application server, 
    Clarity PPM
     database server, and Data Warehouse database server.
  • The timezone, date, and time are the same on the Clarity PPM application server and database servers in the same environment. Do not allow any differences.
This synchronization is necessary because the Load Data Warehouse job imports data into the Data Warehouse database based on the last_updated_date field on the object instances. If the date and time on the servers do not match, data may not be loaded into the Data Warehouse. For other jobs, if the date and time do not match, the job may not start. Or, the job may start later than expected, leading to inaccurate data.
Note: Before running this job, view the system health report. See Health Report, Job Monitor, Governor Settings, and Data Warehouse Audit Dashboard.
Load Data Warehouse Access Rights
The job extracts access rights for investments and resources from the Clarity PPM database and loads them into the Clarity PPM Data Warehouse.
For new installations, this job is active. In upgraded environments, the job state matches its original pre-upgrade state.
To populate the Data Warehouse with data from the database, run this job after running the Load Data Warehouse job and after running the Time Slicing job. You do not need to run this job each time after you run the Load Data Warehouse job.
LDAP - Synchronize New and Changed Users Job
This job synchronizes users that were added or modified in the LDAP server with the Clarity PPM user table.
 
Requirements
 
  • Configure LDAP to run this job.
  • Be an authenticated LDAP user to view this report.
 
Restrictions
 
This job cannot run concurrently with any other instances of this job.
 
Parameters
 
None
LDAP - Synchronize Obsolete Users Job
This job deactivates users in the Clarity PPM user table who are marked as inactive in, or removed from, the LDAP server.
To schedule this job, select LDAP as an Available Job filter.
 
Requirements
 
Configure LDAP to run this job.
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
None
Note: To learn more about LDAP settings in CSA, see CSA: Security, Passwords, LDAP, SSL, SSO, XSS, and FIPS.
Oracle Table Analyze Job
This job refreshes statistics that are used to determine the best execution path for a query. The Oracle Table Analyze job gathers statistics, traditionally by estimate percent, and runs basic database optimization commands. With optimized database statistics, queries run more quickly, resulting in a general performance improvement. However, in previous releases, running this job by estimate percent was inefficient compared to running it by auto-sampling. 
 
Requirements
 
  • CA PPM must be configured with an Oracle database. 
  • Analyze statistics when circumstances warrant, such as when the schema or the data volume changes.
  • We recommend that you run this job at least once a week during off-peak hours or periods of light user activity, such as on a weekend night.
  • If you have a large database with many users updating multiple tables daily, run the job at least once daily.
 
Restrictions
 
  • You cannot run multiple concurrent instances of this job.
 
Parameters
 
None. In CA PPM 15.4.1, the Oracle Table Analyze job no longer includes the following legacy parameters:
  • Schema Name
  • Estimate Percent
The new stored procedure uses the following default values:
  •  
    USER
     for Schema Name
  •  
    AUTO_SAMPLE_SIZE
     for Estimate Percent 
The CMN_GATHER_TABLE_STATS procedure now uses AUTO_SAMPLE_SIZE and all references have been updated.
Note: Running Oracle Stats includes histogram statistics that increase input-output consumption and paging memory. Because the CA PPM Oracle Table Analyze job does not gather histogram statistics, it does not cause these performance issues. CA recommends using the CA PPM Oracle Table Analyze job.
Parallel Job Processor
Partners and developers often integrate solutions with Clarity PPM. They typically map external system data to objects in Clarity PPM. The preferred method of importing this data involves the XML Open Gateway (XOG). You can schedule a background server job to perform the import. In many cases, large volumes of data are imported using sequential jobs. To help speed these operations, a simple parallel job processor is now available to a limited audience.
Note: This job is not available for general use. It is restricted to certified partners, support, and services personnel.
The job divides the workload in a target job into multiple threads inside the same single job. It processes the separate threads and units of work in parallel to save time. After the job definition is imported into Clarity PPM through XOG, the job uses the job processor. The job processor passes the custom pre-processor and post-processor parameters to the job.
 
Requirements
 
  • The job requires a large number of work units to justify its configuration and use.
  • The units of work cannot be dependent on each other.
  • The pre-processor and post-processor implement the provided interfaces. Make the corresponding library available in the Clarity PPM install path.
 
Restrictions
 
  • This job is limited to authorized administrators only. It is not available for general use.
  • This job cannot run concurrently with other instances of this job.
 
Parameters
 
  •  
    Pre-Processor: 
    The pre-processor implements the JobPreProcessor interface. The pre-processor gets the input data and converts it into different units of work that can be processed in parallel. 
     
  •  
    Post-Processor: 
    The post-processor implements the JobPostProcessor interface. The results obtained by processing the units of work are passed to the post-processor, which can analyze them and take actions, such as reporting. The consumer of the job implements the post-processor using the provided interface. The default implementation parses the XOG responses and writes the status to a log file; configure the log in CSA under the name 'com.ca.clarity.job'. The default implementation also writes erroneous responses to the output file location that is specified as a job parameter.
  •  
    Input Parameters: 
    The input data can contain different types of objects and can be organized in a specific order. The default implementation reads the XOG import files from the file location specified by this job parameter.
  •  
    Output Parameters: 
    The output file location for the work units after they are processed.
  •  
    Batch Size: 
    The batch size is used to process the units of work in smaller segments or batches. When you do not specify or pass a batch size, the system uses one tenth of the total number of input items as the default batch size.
Note: You can also pass the custom pre-processor, post-processor, and batch parameters to the job.
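The default batch-size rule described above (one tenth of the total input when no batch size is passed) can be sketched as follows. The max(1, ...) floor for tiny inputs is an assumption:

```python
def make_batches(items, batch_size=None):
    """Split units of work into batches. When no batch size is given, the
    default is one tenth of the total number of input items."""
    if not batch_size:
        batch_size = max(1, len(items) // 10)  # assumed floor of 1
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]
```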
Populate Resource Management Curves
The job is run after completing an upgrade to populate the time-scaled values (TSV) in the resource management views in the New User Experience. The TSV columns in the resource management views are new. When you upgrade using existing data, these new TSV columns appear without allocation data. For example, if you add a forecast rate, the forecast amount is not calculated because the allocation data is not available.
Running the job populates the columns with the latest allocation data in the product.
The job is available for upgrades and is disabled out-of-the-box. The system administrator must activate the job and run it only once after an upgrade.
The job performs the following actions:
  • Updates time-scaled values for availability for all resource types except expenses.
  • Updates time-scaled values for both soft and hard allocation for investments.
Requirements
The new Resources UI in the New User Experience must be activated.
Restrictions
None
Parameters
None
Post Incident Financials Job
This financial processing job posts the transactions of incident effort entries to the general ledger account. Run this job when you want to track the cost of maintaining non-project investments in your organization.
 
Requirements
 
Enter effort for the incident.
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
  •  
    Effort From Date
    Defines the date from which posted effort is processed by this job. Click the Select Date icon to select a specific date, or select a relative date from the drop-down.
  •  
    Effort To Date
    Defines the date up to which posted effort is processed by this job. Click the Select Date icon to select a specific date, or select a relative date from the drop-down.
Post Timesheets Job
The Post Timesheets job is a background process that compiles and posts the actual values into the project plan. This job runs at scheduled intervals. It posts actuals to resource assignments based on the approved hours on timesheets whose time period finish dates are in the past (at least five minutes ago).
This job does the following:
  • Updates the resource assignment and the Transaction Import table with the actual values on the timesheets. The timesheet status changes to "Approved".
  • Advances Estimate To Complete (ETC) past the time period for the posted timesheets on all task assignments of the corresponding resources.
 
Requirements
 
  • Set up Timesheets.
  • Run the Rate Matrix Extraction job to ensure the rates and costs are current at the time the entries are posted to the project. Running this job first helps avoid incorrect project task assignment actual costs.
 
Restrictions
 
This job cannot run concurrently with the following jobs:
  • Post Timesheets (other instances of this job cannot exist)
  • Delete Investments
 
Parameters
 
None
Post to WIP Job
This job offers functionality that was previously only available on the Post to WIP page. That option required you to manually navigate to this page every time you wanted to run it. (Click Home, and under Financial Management, select Post to WIP.)
Use this job to automate 
Post to WIP
 functionality and ensure organizational financial information is up-to-date on a regular basis. To filter the data, use the Voucher Entry ID field. This job also provides the following benefits:
  • You can schedule the job to run in the future, even to run recursively.
  • You can post all the transactions based on a selected investment. This option is not available on the Post to WIP page.
  • The investment and resource lookups in the job parameters display all investments and resources; processing does not check whether any transactions are currently waiting for Post to WIP. (The Post to WIP page displays only resources and investments that are waiting to be posted.) Because jobs can run in the future, you have the flexibility of choosing any resources or investments; you are not restricted to viewing only transactions to be posted.
  • Refining and posting a particular transaction is much easier using this job. Every parameter that you select from top to bottom is a logical AND condition. For example: this OBS and this investment and this resource and this date, and so on.
 
Parameters
 
Only the Post Options parameter is required. When you define no parameter values, the job posts all the transactions (including Timesheets) through today.
  •  
    Investment OBS
    Displays the Department OBS or Location OBS. The application includes the investments in that particular OBS unit and its descendants.
    Note: This parameter is not present on the Post to WIP page.
  •  
    Investment
    Displays one or more investments from a list of all the financially active investments. The investment does not need to have posted transactions waiting at the time you schedule the job; such a requirement would prevent the administrator from scheduling the job in advance.
  •  
    Resource
    Displays one or more financially enabled resources (not roles). This lookup is useful, for example, to fix all the items related to the resource costs of a specific contractor.
  •  
    Entry Type
    Displays the type of entries to include. For example, All, Imported, PPM, Voucher Expense, or Voucher Other.
     
  •  
    Transaction Type
    Displays one or more types of transactions to include. For example, Labor or Expense.
  •  
    Voucher Entry Number
    Displays one or more voucher entries based on your selection in the Entry Type field. Use this parameter to execute the job manually for a specific set of transactions on a voucher. Transactions from Timesheets do not have Voucher IDs. You can still post them using this job.
  •  
    From Transaction Date 
    and 
    To Transaction Date
    Displays the beginning and ending dates to define an optional range of time for filtering the transactions. When you select no dates, the job automatically groups all the transactions ready for posting to WIP.
  •  
    Post Options
    Indicates whether the job recalculates the data using updated exchange rates before it is posted.
 
Restrictions
 
This job exhibits the same behavior as the Post to WIP page. All the rules that apply to that page also apply to the job. The Post to WIP job and Post to WIP page are not compatible. Use one or the other but not both at the same time.
The Post to WIP job does not display the number of transactions that are about to get posted.
Post Transactions to Financial Job
This job verifies and transfers data from the Transaction Import tables to the Financial Management tables. This data could be the result of posted timesheets, or transactions that are imported from external systems.
 
Requirements
 
Set up Financial Management and Timesheets.
 
Restrictions
 
This job cannot run concurrently with the following jobs:
  • Post Transactions to Financials (other instances of this job cannot exist)
  • Datamart Extraction
  • Delete Investments
 
Parameters
 
  •  
    Transaction From Date
    Defines the date from which posted transactions are processed by this job. Click the Select Date icon to define a specific date, or select a relative date from the drop-down.
  •  
    Transaction To Date
    Defines the date up to which posted transactions are processed by this job. Click the Select Date icon to define a specific date, or select a relative date from the drop-down.
    If you do not enter the To Date value, this job posts transactions up to the current date.
PPM Customization Discovery Analysis Job
This job is for use by CA Technologies to analyze customizations to determine environment complexity. This job generates a report that was designed for CA Technologies Global Delivery teams, CA Support, and advanced administrators when preparing to migrate an on-premise environment to a SaaS environment. Some customizations allowed in an on-premise environment might not be permitted in a SaaS environment. Non-compliant objects and other factors determine an estimated level of complexity on the report cover page.
The output contains about 20 tabs of data including non-compliant objects, data sources, and grid portlets. Discuss these customizations before upgrading or changing environments.
Requirements
To prepare this report, select the Active check box for the PPM Customization Discovery Analysis job. The job produces an Excel spreadsheet, which can also be emailed to the recipients specified in the job parameters at the time of execution.
Parameters
  •  
    Obfuscate Email Addresses
    : This setting applies to duplicate e-mail addresses in the Excel output file in the zipped discovery report. When you select this option, e-mail addresses are obfuscated. For example, an address appears as su*********@ca.com, with all but the first characters of the local part masked.
  •  
    Recipient Email Address
    : Enter one or more e-mail addresses separated with a comma or semicolon. Each recipient receives a zip file containing the Excel output.
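The masking format shown above can be sketched like this. The number of retained leading characters is an assumption for illustration; the product's exact rule is not documented here:

```python
def obfuscate_email(address, keep=2):
    """Mask all but the first characters of the local part of an e-mail
    address, leaving the domain visible (similar to su*********@ca.com)."""
    local, _, domain = address.partition("@")
    return local[:keep] + "*" * max(0, len(local) - keep) + "@" + domain
```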
Purge Documents Job
This job permanently deletes documents.
 
Requirements
 
As an administrator, back up all documents that are stored in the product or in the Knowledge Store.
 
Restrictions
 
This job cannot run concurrently with other instances of itself.
 
Parameters
 
  • Purge All Documents for the Following Objects
  • [Or] Purge Documents and Versions Not Accessed for [n] Days
  • [Or] Retain the [n] Most Recent Versions and Purge the Prior Versions
  • All Projects
  • Project OBS
  • Specific Project
  • All Resources
  • Resource OBS
  • Specific Resource
  • All Companies
  • Company OBS
  • Specific Company
  • Knowledge Store
Purge Audit Trail Job
This job removes all audit trail records according to specified job settings. You can set this job to run immediately or according to a set date and time, and you can run it on a recurring schedule.
 
Parameters
 
None
Purge Financial Tables Job
This job permanently deletes all financial transactions for a specified project.
 
Requirements
 
  • (Recommended) Back up all financial transactions before running this job.
  • Grant the Financial Maintenance - Financial Management access right.
  • Verify that the project has a status of "Closed" for the transactions you want to purge.
For more information, see 
Project Management
.
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
Investment
Defines the name of the investment on which this job runs.
Purge Notifications Job
This job performs a bulk deletion of notifications.
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
  •  
    Notification Type
    Defines the notification type to delete (for example, Timesheet). You can select multiple types. Click the Browse Notification Type icon (binoculars) and select the type. To delete all notification types, click the Select All checkbox.
  •  
    From Created Date, To Created Date
    Defines the date range during which the notifications were created. If you do not specify a date range, the job deletes all notifications for the selected type, assignee, and assignee OBS. Specify one of the following dates:
    • Specific Date - Enter a date or use the Calendar tool to select the date.
    • Relative Date - Select the appropriate relative date from the drop-down list.
  •  
    Assignee
    Defines the resource who receives the notifications. All resources (active, inactive, and locked) are listed. You can select multiple resources. 
  •  
    Assignee OBS
    Defines the Organizational Breakdown Structure (OBS) of the resource who receives the notifications. If you select the OBS and do not specify an assignee, the job deletes the notifications for all resources in the selected OBS.
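How the purge parameters above combine can be sketched as an AND of the supplied filters. The field names and in-memory representation are illustrative assumptions:

```python
def should_purge(n, types, date_from=None, date_to=None,
                 assignees=None, obs_units=None):
    """Return True when a notification matches every supplied filter.
    Omitted filters are ignored; with no date range, all notifications of
    the selected type, assignee, and assignee OBS are deleted."""
    if n["type"] not in types:
        return False
    if date_from is not None and n["created"] < date_from:
        return False
    if date_to is not None and n["created"] > date_to:
        return False
    if assignees and n["assignee"] not in assignees:
        return False
    if obs_units and n["obs"] not in obs_units:
        return False
    return True
```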
Purge Report Output Job
This job permanently deletes the report output from the document management system based on the selected parameters.
 
Parameters
 
  • Format - This mandatory parameter defaults to ALL and has four values: ALL, PDF, PPTX, and XLSX. To permanently delete all saved report output from the document management system, execute this job with only Format set to ALL and no other parameters selected.
  • Report - The report name given to the saved report output.
  • From Run Date and To Run Date - Select the date range (From Run Date and To Run Date) for which you want to permanently delete the saved report output. Ensure that both parameters are selected for a specific date range.
  • Run By - This parameter lets you search for the user who executed the job.
Purge Temporary Aggregated Data Job
This job cleans up the data that is created as a part of computing aggregated costs for portfolio management, including cached values of aggregated costs and role demand for the investment hierarchy.
The job is scheduled to run once a day automatically, but it can also be run on demand. Run this job after multiple changes in the investment hierarchy or in the properties of multiple investments.
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with other instances of this job.
 
Parameters
 
  •  
    Purging Option
    Indicates whether to clear all temporary data or only outdated data. Clearing all temporary data cleans up the cached data as well as all other temporary data. Clearing outdated temporary data preserves the cached data that is still valid and deletes the other temporary data.
Rate Matrix Extraction Job
This job extracts rate matrix information and refreshes the rate matrix extraction tables. Run this job each time the rate matrix and the financial properties of a project change. The Rate Matrix Extraction job is required before retrieving rates on projects. 
You can prepare and update the rate matrix data. While the job is running, you can use the rate matrix data.
Note: Preparing the rate matrix data takes more time than updating it. To minimize the duration for which the rate matrix data is unavailable, schedule two instances of the job, each with only one of the two parameters ('Prepare Rate Matrix Data' and 'Update Rate Matrix Data') selected. For better performance, also select the 'Team Rates Only' option on the job instance that has 'Prepare Rate Matrix Data' selected.
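The two-parameter pattern in the note resembles a staging-and-swap design, sketched below. In-memory dictionaries stand in for the temporary table and the NBI_PROJ_RES_RATES_AND_COSTS table; this is an illustration of the pattern, not the product's code:

```python
class RateMatrixStore:
    """'prepare' builds refreshed rates in a staging area while the live
    table stays readable; 'update' then copies staging into the live
    table in one short step, keeping the unavailable window small."""

    def __init__(self, live):
        self.live = dict(live)   # stands in for NBI_PROJ_RES_RATES_AND_COSTS
        self.staging = None

    def prepare(self, fresh_rates):
        # Slow phase: build the new data; live rates are still served.
        self.staging = dict(fresh_rates)

    def update(self):
        # Fast phase: swap staging into live; brief unavailability window.
        if self.staging is not None:
            self.live = self.staging
            self.staging = None
```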
The Rate Matrix Extraction functionality provides rates to the following system actions:
  • Posting Timesheets
  • Baselining a Project or Task
  • Updating Earned Value Totals using the user interface button or scheduling the job (the two methods yield the same result)
    • You can click 
      Update Cost Totals
       from the project team or task action menu; or
    • You can run the Update Earned Value and Cost Totals job
  • Updating Earned Value History
  • Opening a project in Open Workbench or Microsoft Project
We recommend scheduling the Rate Matrix Extraction job using the 
Incremental Update only
 option at intervals. This practice reduces the need for the application to extract the rates in real time for the actions. If the job refreshes the rates at regular intervals, the performance of the actions improves.
 
Requirements
 
Set up a default matrix or a matrix associated with the entity or project. See Set Up a Financial Cost and Rate Matrix.
Tip: In release 15.3, the dynamic runtime or on-the-fly rate matrix is disabled by default. To enable it, see Health Report, Job Monitor, Governor Settings, and Data Warehouse Audit Dashboard.
 
Restrictions
 
This job cannot run concurrently with other instances of this job. Delete all scheduled instances of this job and reschedule the job using the Prepare Rate Matrix Data parameter or the Update Rate Matrix Data parameter.
 
Parameters
 
  •  
    Extract Cost and Rate Information for the Scheduler
    Specifies extracting cost and rate information for a desktop scheduler. This flag triggers the job to generate resource rates for an investment, including rates before the start date and after the finish date of the investment.
    Default:
     Cleared
  •  
    Prepare Rate Matrix Data
    Specifies adding the updated rate matrix data to a temporary table. The data in the NBI_PROJ_RES_RATES_AND_COSTS table stays intact, so the rate matrix data remains available while this step runs.
     
    Default:
     Cleared
  •  
    Update Rate Matrix Data
    Specifies copying the updated rate matrix data from the temporary table to the NBI_PROJ_RES_RATES_AND_COSTS table. The rate matrix data is unavailable while this step runs.
    Default:
     Cleared
  •  
    Incremental Update Only
    Specifies rates extraction only for projects that have been updated. Running the job takes less time compared with a full run.
    Default:
     Selected
  •  
    Team Rates Only
    If this option is NOT selected, the job populates all rates for both team and task assignments, and performance is unchanged. If this option is selected, the job populates only the team-related rates and does not extract task assignment rates. Selecting this option does not impact the stock product functionality, which uses only project-level and team-level rates, not task assignment-level rates.
    If you are using the rate matrix table for reporting or for querying the portlets and expect to see all assignment-related rates, do not use this option.
     
    Default:
     Cleared, populates both team and task assignment rates.
    Note: For better performance, select this option in conjunction with Prepare Rate Matrix Data.
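The prepare/update split can be pictured as a staging-table swap. The following sketch uses SQLite with simplified columns to illustrate the pattern; it is not the product's actual implementation, and the live table keeps only one sample row.

```python
import sqlite3

# Illustrative sketch of the two-phase prepare/update pattern
# (simplified columns; not the actual job implementation).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE NBI_PROJ_RES_RATES_AND_COSTS (res_id INTEGER, rate REAL)")
con.execute("INSERT INTO NBI_PROJ_RES_RATES_AND_COSTS VALUES (1, 100.0)")

# Phase 1 -- 'Prepare Rate Matrix Data': write refreshed rates to a
# temporary table; the live table stays intact and readable.
con.execute("CREATE TABLE tmp_rates (res_id INTEGER, rate REAL)")
con.execute("INSERT INTO tmp_rates VALUES (1, 125.0)")

# Phase 2 -- 'Update Rate Matrix Data': quickly replace the live rows
# with the prepared rows; the live table is briefly unavailable.
con.execute("DELETE FROM NBI_PROJ_RES_RATES_AND_COSTS")
con.execute("INSERT INTO NBI_PROJ_RES_RATES_AND_COSTS SELECT * FROM tmp_rates")

print(con.execute("SELECT rate FROM NBI_PROJ_RES_RATES_AND_COSTS").fetchone()[0])
```

Because the slow work happens in phase 1 against the staging table, scheduling the two phases as separate job instances keeps the unavailability window to the short phase-2 copy.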
                            
Register New Investments for Enable Capitalization Job
After processing investments, the 
Enable Capitalization
 job sets each investment to 
S
 (Successfully Processed) or 
F
 (Failed). Investments with a status of 
S
 are not considered by the 
Enable Capitalization
 job again. As an administrator, you might want the 
Enable Capitalization
 job to remain available later, not just once after an upgrade. This job identifies investments that would otherwise be excluded and registers them for the 
Enable Capitalization
 job. 
You can run the 
Enable Capitalization
 job once after an upgrade to Release 13.2 or higher. That job enables the capitalization feature for existing investments that were created before Release 13.2. However, that job is not intended as an ongoing solution for bulk updates. The 
Register New Investments for Enable Capitalization
 job provides the functionality of the 
Enable Capitalization
 job at any later time, not just once during the upgrade, even after upgrading from Release 13.1.
Beginning with Release 14.3, all investments created after an upgrade are available for processing by the 
Enable Capitalization
 job at any time. The new job stores investments created after an upgrade in the same temporary table that is the source for the 
Enable Capitalization
 job. After you run the 
Register New Investments for Enable Capitalization
 job, the table includes both the investments from any prior release not yet processed and the new investments created after the upgrade.
 
Requirements
 
To make post-upgrade bulk updates at a future time, perform the following sequence of steps:
  1. Perform a 
    Clarity PPM
     upgrade.
    Weeks or months are likely to pass before your organization is ready for the next step.
  2. Update the 
    Cost Type
     and 
    Team Capitalization %
     attributes for each investment. Also update the 
    Cost Type
     values at the task level.
  3. Run the following jobs:
    1.  
      Register New Investments for Enable Capitalization
       
    2.  
      Enable Capitalization
       
    3.  
      Copy Cost Plan of Record Charge Code with Cost Type
       
The
 Copy Cost Plan of Record Charge Code with Cost Type
 job copies the investment plans of record (POR) and adds Cost Type as an additional grouping attribute.
Note: Unexpected results could occur if you attempt to run these bulk updates on a recurring basis. After upgrading, when your organization is ready to use the new 
Cost Type
 functionality, run the sequence of jobs as suggested.
Example: Investment with Financial Data
Use this example to create a copy of the cost plan of record. 
  1. Create an investment with financial data.
  2. Create a cost plan of record with charge code and cost type as grouping attributes.
  3. Create a detail row for the cost plan with 
    CC1
     as charge code and 
    Operating
     as the cost type.
  4. Run the 
    Register New Investments for Enable Capitalization
     job.
  5. Select the investment and run the 
    Enable Capitalization
     job.
  6. Run the 
    Copy Cost Plan of Record Charge Code with Cost Type
     job. Select the investment and the charge code (
    CC1
    ) as capital charge code.
Cost Type is not added again because it already exists in this new cost plan. On the detail page, the cost type is 
Capital
 because the user selected the 
CC1
 charge code as capital type charge code in the job. You can create multiple detail rows such as CC2 and CC3 and set their Cost Type to 
Capital
. However, those rows appear as 
Operating
 if you only select the CC1 charge code as Capital type for the cost type conversion job.
Example: Service Exception Error
In this example, the application does not create a copy of the cost plan of record. Use this example to troubleshoot the Service Exception error.
  1. Create an investment with financial data.
  2. Create a cost plan of record with charge code and cost type as grouping attributes.
  3. Create a detail row for the cost plan with the following settings:
    1. Charge Code
       of 
      CC1
      .
    2. Cost Type
       of 
      Capital
      .
  4. Create a second detail row for the cost plan with the following settings:
    1. Charge Code
       of 
      CC1
      .
    2. Cost Type
       of 
      Operating
      .
  5.  
    Run the 
    Register New Investments for Enable Capitalization
     job.
     
  6. Select the investment and run the 
    Enable Capitalization
     job.
  7. Run the 
    Copy Cost Plan of Record Charge Code with Cost Type
     job. Select the investment and the charge code (
    CC1
    ) as capital charge code.
    The job would have to set both cost type values to 
    Capital
    . Because the software does not allow duplicate detail rows, the job does not create a new cost plan.
The following message appears in the CSA logs for this job:
Error Msg : Service Exception Copy cost plan job results in creation of a duplicate detail entry.
The following message appears in the Job logs:
Execution of job failed
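The failure can be reproduced in miniature: converting every detail row with the selected capital charge code to Capital collapses the two rows into duplicates. The following is a hypothetical sketch with simplified detail keys, not product code.

```python
# Hypothetical sketch of why the copy fails: converting both rows to
# Capital produces duplicate (charge_code, cost_type) detail keys.
details = [("CC1", "Capital"), ("CC1", "Operating")]
capital_charge_codes = {"CC1"}  # charge codes selected as Capital in the job

converted = [(cc, "Capital" if cc in capital_charge_codes else ct)
             for cc, ct in details]
has_duplicates = len(converted) != len(set(converted))
print(has_duplicates)  # True: duplicate detail rows, so the copy is rejected
```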
Example: Idea with Financial Data
In this example, the application does not create a cost plan. The 
Copy Cost Plan of Record Charge Code with Cost Type
 job is run on the same date for both an idea and a project. The project already has a cost plan with the same ID.
  1. Create an idea with financial data.
  2. Create a cost plan of record with charge code as a grouping attribute.
  3. Run the 
    Register New Investments for Enable Capitalization
     job.
  4. Select the idea and run the 
    Enable Capitalization
     job.
  5. Select the idea and run the 
    Copy Cost Plan of Record Charge Code with Cost Type
     job.
    The software creates a new copy of the cost plan of record and adds cost type again to the new cost plan.
  6. Convert the idea to a project and select the 
    Copy Financial Properties and Financial Plans
     check box.
    The software creates a new project and copies the financial properties from the idea.
  7. Run the 
    Register New Investments for Enable Capitalization
     job again.
  8. Select the converted project and run the 
    Enable Capitalization
     job.
  9. Select the converted project and run the 
    Copy Cost Plan of Record Charge Code with Cost Type
     job.
Remove Job Logs and Report Library Entries Job
This job removes old job log entries and report library entries from the database after they exceed a specified age in days.
Requirements
None
Restrictions
You cannot run this job concurrently with any other instance of the Remove Job Logs and Report Library Entries job.
Parameters
  • Report age for delete
  • Job age for delete
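The cleanup that the job performs amounts to a simple age cutoff. The entry structure and field names in this sketch are illustrative; only the idea of the two age parameters comes from the job description.

```python
from datetime import datetime, timedelta

# Minimal sketch of age-based cleanup (hypothetical entry structure).
def purge(entries, age_days, now=None):
    now = now or datetime.now()
    cutoff = now - timedelta(days=age_days)
    # Keep only entries newer than the cutoff; older ones are removed.
    return [e for e in entries if e["created"] >= cutoff]

now = datetime(2024, 1, 31)
logs = [{"id": 1, "created": datetime(2024, 1, 1)},
        {"id": 2, "created": datetime(2024, 1, 30)}]
print([e["id"] for e in purge(logs, age_days=7, now=now)])  # [2]
```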
Restore Domains Job
When upgrading Jaspersoft and importing advanced reporting content, the domain data can become corrupted. Run the Restore Domains job to restore the domains to their original state with any custom objects and attributes. By default, the Restore Domains job is in an 
Active
 state. If you disable the job and upgrade your Jaspersoft version, the Restore Domains job remains disabled.
 
Requirements
 
None
 
Restrictions
 
None
 
 
Parameters
 
  •  
    Content Pack
    Specifies the add-in that contains the domain data that you want to restore (for example, PMO Accelerator).
Send Action Item Reminder Job
Use this job to send action item reminder alerts, SMS, or email messages.
The removal of certain calendar functions in a previous generation of the product also removed the 
Send Calendar Event Reminders
 job. However, the Send Action Item Reminder job still applies if your organization uses action items. You can open the Organizer and send a manual action item. You can also send a reminder at a specific time interval prior to the action item due date, for example, ten minutes, two hours, or three days before the action item is due.
This job notifies the recipients that the action item is due. The job adjusts any future reminders for recurring action items.
 
Requirements
 
One or more action items with reminders already set.
 
Restrictions
 
  • This job cannot run concurrently with other instances of this job.
  • The job checks if the reminder event time is before or concurrent with the present time. It also checks that the action item due date and time occurs at the present time or in the future. When these conditions are satisfied, the job sends a notification mail. The job does not send reminders for an action item which has already been completed.
  • As a best practice, run the job more frequently than the shortest reminder time you have in the system. If you have a reminder time of 1 day at the earliest, schedule this job to run every 6 hours. Reminders often need to reach people around the world in multiple time zones during their business hours.
 
Parameters
 
None. The job uses the properties of the action items.
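The restriction conditions above amount to a simple predicate: the reminder time has been reached, the due date has not passed, and the item is not completed. A hedged sketch with hypothetical field names:

```python
from datetime import datetime, timedelta

# Sketch of the reminder check described in the restrictions above
# (hypothetical field names, not the product's internal logic).
def should_notify(due, reminder_before, completed, now):
    reminder_time = due - reminder_before
    # Notify when the reminder time is at or before now, and the item
    # is still due now or in the future, and it is not completed.
    return (not completed) and reminder_time <= now <= due

now = datetime(2024, 6, 1, 12, 0)
due = datetime(2024, 6, 1, 14, 0)
print(should_notify(due, timedelta(hours=2), False, now))    # True: reminder fires
print(should_notify(due, timedelta(minutes=10), False, now)) # False: too early
```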
Support Data Scan Job
The Support Data Scan job is available to assist administrators when troubleshooting issues with CA Support. This job provides performance diagnostics and data integrity metadata for analysis by CA Support technicians. The job also checks governor settings that exceed their default values.
The job title appears in the resulting email. Email messages to [email protected] with a subject line that begins with 
Case 123456
 are uploaded automatically to the specified support case with their attached job logs. Administrators can configure the job to send an email message to a support engineer with the diagnostics attached. The job supports multiple email recipients separated by a comma or semicolon.
The diagnostic file has the following naming format:
PPMSupport-GeneralPerfAudit.tab
The name begins with 
PPMSupport
, is followed by the request type or class name, and ends with the .tab suffix. You can view the tab-delimited format in Microsoft Excel. A copy of the file also appears in the logs directory. The following requests are available for execution:
  •  
    general_perf_check
     performs a check to determine if areas of the product are configured in a sub-optimal manner. 
  •  
    orphan_record_check
     scans some common tables to determine if there are orphan records. 
  •  
    scan_large_tables
     performs an analysis on some common tables to determine their record counts.
  •  
    oom_perf_check
     examines the configuration of the environment for areas that can contribute to high memory usage.
Note: Run these requests only under the direction of CA Support. The results may identify certain data conditions, but the conditions are not necessarily a problem. CA Support is trained to interpret the results of these scans.
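Two mechanical details from this section, the file naming format and the recipient list separators, can be sketched as follows. The helper names and the example addresses are illustrative, not product code.

```python
import re

# Sketch of the diagnostic file naming format and of splitting multiple
# email recipients on a comma or semicolon (illustrative helpers).
def diagnostic_filename(request_class):
    return f"PPMSupport-{request_class}.tab"

def split_recipients(field):
    return [r.strip() for r in re.split(r"[,;]", field) if r.strip()]

print(diagnostic_filename("GeneralPerfAudit"))
print(split_recipients("admin@example.com; support@example.com"))
```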
Synchronize Agile Central Job
The job synchronizes 
CA Agile Central
 portfolio items to 
Clarity PPM
 projects and tasks.
 
 
CA Agile Central
 to 
Clarity PPM
 Integration Direction
 
In the CA Agile Central to 
Clarity PPM
 integration direction, the job makes the following updates in the systems:
 
  • Creates top-level CA Agile Central portfolio items (levels P2-P5) based on the information in the CA PPM project. The job creates the portfolio items provided they do not already exist for the linked projects. 
  • Creates 
    Clarity PPM
     tasks from the lower level 
    CA Agile Central
     portfolio items (level P1) based on the information in the portfolio items. The job creates the 
    Clarity PPM
     tasks provided they do not already exist. The synchronization depends on the selection of the Create and Sync Tasks option while configuring the 
    CA Agile Central
     integration settings in 
    Clarity PPM
    .
  • Creates 
    Clarity PPM
     tasks for the user stories associated to the lower level 
    CA Agile Central
     portfolio items (level P1). The job creates the 
    Clarity PPM
     tasks provided they do not already exist. The synchronization depends on the selection of the Create and Sync Stories option while configuring the 
    Clarity PPM
     project. Also, the stories are synchronized provided the Create and Sync Task option is selected while configuring the 
    CA Agile Central
     integration settings in 
    Clarity PPM
    . If a Time Tracking Project Template is selected for the integration, the time tracking tasks are also created as subtasks at the same level as the user story tasks.
     In this integration direction, since you cannot specify which portfolio items (level P1) to synchronize with 
    Clarity PPM
    , you also cannot specify which user stories are brought over from 
    CA Agile Central
    . All stories for all the features that are synchronized with 
    Clarity PPM
     are brought over from 
    CA Agile Central
    .
  • Sets the initiative/feature ID of 
    CA Agile Central
     portfolio item on the synchronized project properties on the Agile Summary subpage of 
    Clarity PPM
    .
  • Populates agile attributes from CA Agile Central to 
    Clarity PPM
     on the Agile Summary subpage of integrated projects and tasks.
  • Creates 
    Clarity PPM
     subtasks underneath the lower level 
    CA Agile Central
     portfolio item. Values are preset from the Open for Time Entry, Charge Code, and Cost Type fields based on the project template that is associated with the Integration settings in 
    Clarity PPM
    .
  • Synchronizes 
    CA Agile Central
     projects to 
    Clarity PPM
     project teams provided 'Create and Sync Team' option is selected while configuring the 
    CA Agile Central
     integration settings in 
    Clarity PPM
    .
 
 
 
Clarity PPM to CA Agile Central Integration Direction
 
In the 
Clarity PPM
 to 
CA Agile Central
 integration direction, the job makes the following updates in the systems:
  • Creates top-level CA Agile Central portfolio items (levels P2-P5) based on the information in the CA PPM project. The job creates the portfolio items provided they do not already exist for the linked projects.
  • Creates lower-level CA Agile Central portfolio items (level P1) based on the information in the CA PPM tasks. The job creates the portfolio items provided they do not already exist. A corresponding portfolio item is created in CA Agile Central only for tasks marked as Synchronize. This also depends on the selection of the Create and Sync Tasks option while configuring the CA Agile Central integration settings from CA PPM.
  • Creates 
    Clarity PPM
     tasks for the user stories associated to the lower level 
    CA Agile Central
     portfolio items (level P1). The job creates the 
    Clarity PPM
     tasks provided they do not already exist. The synchronization depends on the selection of the Create and Sync Stories option while configuring the 
    Clarity PPM
     project. Also, the stories are synchronized provided the Create and Sync Task option is selected while configuring the 
    CA Agile Central
     integration settings in 
    Clarity PPM
    . If a Time Tracking Project Template is selected for the integration, the time tracking tasks are also created as subtasks at the same level as the user story tasks.
     In this integration direction, you can select which tasks you want to push to 
    CA Agile Central
     as a portfolio item (level P1). Accordingly, only the user stories for these synchronized tasks are brought over from 
    CA Agile Central
     to 
    Clarity PPM
    .
  • Sets the Initiative/Feature ID of the CA Agile Central portfolio item on the synchronized project properties on the Agile Summary subpage of CA PPM.
  • Populates Portfolio Item Name, Planned Start Date, and Planned End Date agile attributes from CA PPM to CA Agile Central portfolio items. At the task level, Planned Start Date and Planned End Date are based on the task start and finish dates. If these dates are modified in CA PPM, the job overwrites the Planned Start Date and Planned End Date agile attributes in CA Agile Central.
  • Populates agile attributes from CA Agile Central to CA PPM on the Agile Summary subpage of integrated projects and tasks. At the task level, the task ID is not overwritten by the portfolio item ID.
  • Synchronizes CA Agile Central projects to CA PPM project teams provided the Create and Sync Team option is selected while configuring the CA Agile Central integration settings from CA PPM.
 
Bidirectional Synchronization
 
 In the Bidirectional integration, the job makes the following updates in the systems:
  • For each task marked as Synchronize in a  CA PPM project that is associated to a CA Agile Central portfolio item, creates a corresponding portfolio item in CA Agile Central.
  • For each portfolio item in CA Agile Central belonging to a parent portfolio item that is associated to a CA PPM project, creates a corresponding task in CA PPM.
  • If the Create and Sync Stories option is selected for a project, creates CA PPM tasks for the user stories associated to each portfolio item in CA Agile Central.
The job logs error messages in English only. The project manager can also run the job from the Actions menu of an individual project. See Integrate CA PPM with CA Agile Central for details.
Requirements
The Agile add-in must be installed. See Integrate CA PPM with Agile Central.
Restrictions
The job cannot run concurrently with any other instance of itself or the following jobs:
  • Post Timesheets
  • Time Slicing
Note: Try to avoid scheduling the Synchronize Agile Central job during weekends, when Agile Central typically performs system maintenance (you can check status at http://status.rallydev.com/). Connections could be lost during job execution, possibly causing failures. Because the Synchronize Agile Central action on the Project object updates only one project at a time and finishes quickly, we recommend using that option instead to synchronize projects during the weekend.
Parameters
  •  
    Date Window: 
    Defines the date window for the updated projects that you want to synchronize with CA Agile Central. For example, if you select 
    Projects updated in the last 12 months
    , only the 
    Clarity PPM
     projects updated in the last 12 months are synchronized.
  •  
    Project Status: 
    Defines
     
    the status of the projects that you want to synchronize with CA Agile Central. For example, if you select 
    All Projects
    , both active and inactive projects are synchronized. 
Synchronize Jaspersoft Roles Job
This job ensures that users of 
Clarity PPM
 groups that have advanced reporting rights and exist in Jaspersoft are matched to the corresponding roles in Jaspersoft. This job is driven by 
Clarity PPM
 groups. 
 This job does not affect designer roles such as ROLE_ADHOC_DESIGNER, ROLE_ADMINISTRATOR, ROLE_ANONYMOUS, ROLE_DASHBOARD_DESIGNER, ROLE_DATASOURCE_DESIGNER, ROLE_DOMAIN_DESIGNER, ROLE_REPORT_DESIGNER, and ROLE_USER. Any designer roles, either available by default or created by users at the tenant level, are not impacted by this job. If this job is executed without selecting the system option, the job fails and an error message is displayed: "Enable the system option 'Allow Jaspersoft Role Synchronization' and execute the job again."
The results of the Synchronize Jaspersoft Roles job vary based on the 
Clarity PPM
 group in the following scenarios:
  1.  
    The CA PPM group has a matching role in Jaspersoft
    : The users in Jaspersoft get overwritten based on the group users. If there is a group with users but no matching role in Jaspersoft, no users are transferred or impacted in Jaspersoft. 
  2.  
    The CA PPM group does not exist but the role exists in Jaspersoft
    : After the job completes successfully, the users with the assigned role in Jaspersoft are removed.
  3.  
    The CA PPM group exists with corresponding Jaspersoft roles and some users are removed from the CA PPM group
    : After the job completes successfully, the respective users are removed from the role in Jaspersoft. 
  4.  
    The CA PPM group is inactivated and changes are made to the group users
    : After the job completes successfully, all the roles in Jaspersoft are retained and there is no change to the user assigned to the roles in Jaspersoft. If the group is activated again, any removal and addition of users to the 
    Clarity PPM
     group are reflected in the Jaspersoft user roles.
  5.  
    The CA PPM group exists with corresponding roles in Jaspersoft and a user role is manually updated in Jaspersoft
    : After the job completes successfully, the user's roles in Jaspersoft are updated to match the 
    Clarity PPM
     groups.
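Scenarios 1 and 3 above describe the Jaspersoft role membership being overwritten to match the Clarity PPM group. A minimal sketch of that set logic, assuming simple user-name sets rather than the actual synchronization code:

```python
# Sketch of overwriting a Jaspersoft role's membership from a
# Clarity PPM group (illustrative set logic only).
def sync_role(group_users, role_users):
    to_add = group_users - role_users      # in the group, missing the role
    to_remove = role_users - group_users   # have the role, left the group
    return (role_users | to_add) - to_remove

group = {"alice", "bob"}   # Clarity PPM group members
role = {"bob", "carol"}    # current Jaspersoft role members
print(sorted(sync_role(group, role)))  # ['alice', 'bob']
```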
Requirements
For the job to match the corresponding users in 
Clarity PPM
 groups with the Jaspersoft roles, verify the following prerequisites:
  • In 
    Clarity PPM
    , assign Advanced Reporting access rights to each report user. Without rights, those users are not synchronized with Jaspersoft roles.
  • Add the report users to groups in 
    Clarity PPM
     and verify that the groups are 
    active
    .
  • Select the Allow Jaspersoft Role Synchronization check box on the System Options page.
  • To ensure all 
    Clarity PPM
     users exist in Jaspersoft, run the Create and Update Jaspersoft Users job before you run the Synchronize Jaspersoft Roles job. The 
    Clarity PPM
     access group ID is the same as the role name in Jaspersoft.
Restrictions
None
Parameters
None
Synchronize Portfolio Investments Job
This job synchronizes the portfolio planning data with the latest data from the actual investments. The update is based on a sync schedule that the portfolio manager defines in the portfolio properties. Whenever the job runs based on the sync schedule, the latest data from the actual investments is reflected in the portfolio.
 The job copies only those attributes from the actual investments that were registered to display on your portfolio investment pages and views. For more information about viewing the registered attributes or changing the list of registered attributes, see 
CA PPM Studio Development
.
Requirements
None
Restrictions
None
Parameters
None
Time Slicing Job
This job processes all configured time slices and updates discrete transactional data for actual task assignment values, Estimate To Complete (ETC) and baselines, timesheet actuals, team and assignment data from a scenario, resource allocations to projects, and resource availability.
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with any other instance of itself.
 
Parameters
 
None
Tomcat Access Log Import/Analyze Job
This job imports and analyzes Tomcat access log files from the local 
Clarity PPM
 environment (all app services), then stores and summarizes the data in designated tables (LOG_DETAILS, LOG_SUMMARY, LOG_FILES, LOG_REPORTDEFS). With the addition of custom portlets and queries or externally available content, this analysis data can provide details regarding the performance of the 
Clarity PPM
 system. If you are not running a Tomcat application server, the job runs but does not import any data.
 
Requirements
 
  • The system should have application access logs. To refresh the data in the portlet, you first run a job to export the data from the server and import it into the application database. 
  • For best results, schedule this job to run recurrently at an appropriate interval; for example, once every night.
 
Restrictions
 
None
 
Parameters
 
  •  
    Log Date
    Specifies the date for the access logs that are imported and analyzed. If no date is specified, the default date is yesterday.
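A line in the common Tomcat access log pattern ('%h %l %u %t "%r" %s %b') can be parsed as sketched below; your configured pattern, and the job's actual parser, may differ.

```python
import re

# Sketch of parsing one line in the common Tomcat access log pattern;
# the sample line and regular expression are illustrative.
LINE = '10.0.0.1 - admin [01/Jun/2024:10:00:00 +0000] "GET /niku/app HTTP/1.1" 200 5120'
PATTERN = re.compile(r'(\S+) (\S+) (\S+) \[(.*?)\] "(.*?)" (\d{3}) (\S+)')

host, _, user, ts, request, status, size = PATTERN.match(LINE).groups()
print(host, user, status, size)  # 10.0.0.1 admin 200 5120
```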
Log Analysis Data Carried Forward After Data Migration
Symptom
If you are migrating data from one environment to another environment, the log data related to servers from the first environment appear in the second environment.
Solution
To prevent the log data of one environment from appearing in another environment after data migration, run the 
Delete Log Analysis Data
 job.
Example:
Consider a scenario where you have run a few jobs in your production environment and log tables are populated.
If you migrate your data from your production environment to a non-production environment, the servers from your production environment are listed and visible in the non-production environment logs.
The following procedure helps you to verify whether the logs belong to the current environment.
Follow these steps:
  1. Log in to Classic Clarity PPM.
  2. Navigate to 
    ADMINISTRATION
    Security and Diagnostics
    Log Analysis
    .
  3. Click the 
    Hostname
     drop-down list.
  4. Verify that the server name for the current environment is listed.
  5. If the 
    Hostname
     drop-down list displays the server name of the previous environment, run the 
    Delete Log Analysis Data
     job.
Update % Complete Job
The 
Update % Complete
 job updates the percent (%) complete values whenever you change project or task data that affects the percent complete calculation. The job runs only if the % Complete Calculation Method field (on the project scheduling properties page) is set to Duration or Effort. Schedule this job to update the percent complete values automatically.
In addition, the following operations trigger this job to run:
  • Publishing the tentative schedule.
  • Posting actuals and distributing them to the project plan.
This job is scheduled to run automatically every 30 minutes, but you can also run this job on demand.
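For the Effort method, a percent-complete value is commonly derived from actuals versus remaining ETC. The arithmetic below is a hedged sketch; the product's exact formula is not documented in this section.

```python
# Hedged sketch of an effort-based percent-complete calculation
# (actuals versus remaining ETC); the product's formula may differ.
def percent_complete_effort(actuals, etc):
    total = actuals + etc
    return 0.0 if total == 0 else 100.0 * actuals / total

print(percent_complete_effort(30.0, 70.0))  # 30.0
```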
 
Requirements
 
For best results, schedule this job to run recurrently at an appropriate interval; for example, every 10 minutes.
 
Restrictions
 
None
 
Parameters
 
None
Update Aggregated Data Job
Use the Update Aggregated Data job to flatten the percentage allocations between investments. Run this job after making multiple changes to investment data.
This job is scheduled to run automatically every 10 minutes, but you can also run this job on demand. Do not decrease the frequency of this job, as many objects depend on this job to get the flattened view of percentage allocation between the investments.
Run this job successfully before running the Generate Invoices job if you are using Chargebacks. Run this job against the current planned cost data if you are using Portfolio Management. This job is required to show the planned cost and budget data for all portfolios and included investments.
To improve the performance of this job and to avoid database contention, make the following jobs incompatible:
  • Datamart Rollup - Time Facts and Time Summary Job
  • Datamart Extraction Job
  • Rate Matrix Extraction Job
  • Oracle Table Analyze Job
  • Time Slicing Job
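Flattening can be pictured as multiplying allocation percentages down the investment hierarchy, so that each descendant's effective share of the top-level investment is directly available. The semantics and data shape below are assumed for illustration, not taken from the product.

```python
# Illustrative sketch of 'flattening' hierarchical allocation
# percentages (assumed semantics; simplified data shape).
hierarchy = {"program": [("projectA", 0.5), ("projectB", 0.5)],
             "projectA": [("task1", 0.4)]}

def flatten(inv, pct=1.0, out=None):
    out = {} if out is None else out
    for child, share in hierarchy.get(inv, []):
        # Effective share is the product of percentages along the path.
        out[child] = pct * share
        flatten(child, pct * share, out)
    return out

print(sorted(flatten("program").items()))
```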
 
Requirements
 
None
 
Restrictions
 
This job cannot run concurrently with any other instance of itself or with the Generate Invoices job.
 
Parameters
 
None
Update Allocation from Estimates Job
The Update Allocation from Estimates job updates team allocations to match the remaining ETC, starting from the team member ActThrough date.
 
Best Practice:
 This job processes team and task data, so run it during off-peak hours. Scheduling this job during normal work hours can affect overall system performance.
 
Restrictions
 
This job is available only for projects. The job updates the allocation values only for active projects and ignores all inactive projects.
 
Parameters
 
  •  
    Project Name
    Indicates the active project that the user has edit access rights to. The job processes the selected project.
  •  
    Manager: 
    Indicates the project or investment manager whose resources the user has view access rights to. The job processes all active projects that are assigned to the selected manager.
  •  
    Investment OBS: 
    Indicates the Investment OBS unit that the user has view access rights to. The job processes all active projects that are associated with the selected OBS unit, based on the OBS Mode setting.
  •  
    OBS Mode: 
    Determines which branches of the selected OBS structure are processed when selecting investments using an OBS Unit.
    Values:
     Unit Only, Unit and Ancestors, Unit and Descendants
    Default: 
    Unit Only.
 
Error Handling
 
The user that schedules these jobs may not have the appropriate security rights to all the resources on the investments.
When making resource staffing decisions, always have the manager with access rights make allocation and ETC decisions for their resources.
If the scheduler of the job does not have the appropriate access rights, the following actions occur:
  • The job proceeds through all resources on the designated investments.
  • The job logs the error in the jobs log file and includes the following information:
    • Investment Name
    • Team Member Name
    • Error Message
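The update described above can be sketched per time period: allocation in periods after the ActThrough date is reset to the remaining ETC for that period. The field names and per-period semantics below are assumed for illustration.

```python
from datetime import date

# Hypothetical sketch: after the ActThrough date, allocation per period
# is reset to match the remaining ETC (assumed semantics).
def update_allocation(periods, act_through):
    return [{**p, "allocation": p["etc"]} if p["start"] > act_through else p
            for p in periods]

periods = [{"start": date(2024, 1, 1), "allocation": 40.0, "etc": 10.0},
           {"start": date(2024, 2, 1), "allocation": 40.0, "etc": 25.0}]
updated = update_allocation(periods, act_through=date(2024, 1, 15))
print([p["allocation"] for p in updated])  # [40.0, 25.0]
```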
Update Data Warehouse Trend Job
This job updates an existing snapshot of your trending data. When it runs, this job deletes the previous trend snapshot data and then re-creates a new snapshot in its place.
Requirements
Run the Load Data Warehouse job before running this job.
Parameters
 
Trend
: All the snapshots of the selected trend are refreshed.
Update Report Tables Job
The Update Report Tables job is required for reports that display any of the following: monthly or weekly calendar periods, FTE amounts, and OBS level or phase grouping options. For Advanced Reporting, the job is required only for reports that display resource skill relationships. The job populates reporting tables based on the parameters that you select when running the job. If these tables are not populated, reports that depend on them display a 'No Data Found' message. Schedule the job to run nightly to keep the reporting tables up to date.
 
Requirements
 
None
 
Parameters
 
The following are the available parameters for this job, each one populating a different table:
  •  
    Update Reporting Calendar 
    Populates the calendar table (rpt_calendar) that stores date ranges for daily, weekly, monthly and quarterly calendar periods, and the FTE for the date range. The start day of the weekly periods is determined by the First Day of Work Week field set in the Administration tool under the Project Management settings. Selecting this option populates the table five years back and five years forward, based on the current date. For example, if you run the job in October of 2015, the table is populated with periods from October 2010 through October 2020. Most of the reports displaying data by calendar period reference this table. The reports display a 'No Data Found' message if this table is not populated. Run the job with this option selected at least once a month and if the availability of the admin resource changes. The admin resource calendar determines the FTE calculation. If you are using Advanced Reporting, this parameter is not applicable, so you can leave it unchecked. In the data warehouse, the rpt_calendar table is replaced by the dwh_cmn_period calendar table.
  •  
    Update Investment Hierarchy 
    Populates the investment hierarchy table (rpt_inv_hierarchy) that stores up to ten levels of investment hierarchical relationships and hierarchy allocation percentages. The investment hierarchy table is used by reports in the 
    Clarity PPM
     solution pack. The parameter also populates the program hierarchy table (rpt_program_hierarchy) that stores up to five levels of program and project hierarchical relationships. The program hierarchy table is not used by any reports in the solution pack. If you are using Advanced Reporting, this parameter is not applicable, so you can leave it unchecked. In the data warehouse the rpt_inv_hierarchy table is replaced by the investment hierarchy table (dwh_inv_hierarchy).
  •  
    Update WBS Index 
    Populates the WBS index table (rpt_wbsindex) that stores relationships between phases and tasks. The option allows an incremental update so you can schedule the job to run frequently (for example, once every hour) if necessary. If you are using Advanced Reporting, this parameter is not applicable, so you can leave it unchecked. In the data warehouse the rpt_wbsindex table is replaced by the task hierarchy table (dwh_inv_task_hierarchy).
  •  
    Update Resource Skills Index 
    Populates the resource skills tables (rpt_res_skills_index and rpt_res_skills_flat) that store relationships between resource skills and their parent skills. The job supports up to ten levels in the skills hierarchy. Run the job with this option selected when you create, delete, rename, or modify a skill in the Administration tool under Data Administration, Skills Hierarchy. You do not need to run the job after associating skills with resources. If you are using Advanced Reporting, this parameter is still applicable, so select it.
  •  
    Update OBS 
    Populates the data mart OBS table (nbi_dim_obs) that stores OBS unit information up to ten levels. The table is used in some reports for grouping by OBS level. Run the job with this option selected nightly or if there are changes to the OBS structure. If you are using Advanced Reporting, this parameter is not applicable, so you can leave it unchecked. In the data warehouse the nbi_dim_obs table is replaced by the OBS Unit lookup table (dwh_lkp_obs_unit).
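The five-year window described for the Update Reporting Calendar parameter can be sketched as follows. This is an illustration of the documented date arithmetic only; the function name and use of the first of the month are assumptions, not product code:

```python
from datetime import date

def reporting_calendar_window(run_date: date) -> tuple[date, date]:
    """Return the (start, end) of the period range that the Update
    Reporting Calendar option populates: five years back and five
    years forward from the date on which the job runs."""
    start = run_date.replace(year=run_date.year - 5)
    end = run_date.replace(year=run_date.year + 5)
    return start, end

# As in the documentation example: running the job in October 2015
# populates periods from October 2010 through October 2020.
start, end = reporting_calendar_window(date(2015, 10, 1))
```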
Update Earned Value History Job
The Update Earned Value History job calculates earned value for a project or set of projects and creates earned value snapshots of the time sliced data. This data is based on the earned value reporting period that is assigned to the project and the parameters that you select. The earned value snapshot is used for historical earned value analysis (EVA) and reporting. The snapshots are stored in rows in the PRJ_EV_HISTORY (earned value history) table. You can use this reporting data in reports.
This job invokes the Update % Complete job on a recurring schedule that is based on how often your organization reports on its earned value data. The job uses the lag value to determine the day to take the snapshot: a snapshot is taken the first time the job runs on or after the first day following the defined lag.
 
Example: Monthly with Three Day Lag
 
If you schedule this job to run monthly starting 2/1/15 with a lag of three days and you have associated the project to an earned value reporting period whose period type is defined as Monthly and frequency is the first day of the month, a snapshot for January 2015 is generated only when the job runs on 2/04/15 or later.
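The lag rule in the example can be sketched as simple date arithmetic. The function name and the idea of passing the period boundary explicitly are illustrative assumptions; the job derives these values from the project's earned value reporting period:

```python
from datetime import date, timedelta

def earliest_snapshot_date(next_period_start: date, lag_days: int) -> date:
    """Earliest run date on which the job takes the snapshot for the
    period that just ended: the period boundary plus the lag days."""
    return next_period_start + timedelta(days=lag_days)

# Monthly period boundary 2/1/15 with a three-day lag: the January 2015
# snapshot is generated only when the job runs on 2/04/15 or later.
cutoff = earliest_snapshot_date(date(2015, 2, 1), 3)
```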
For each project that meets the job parameter criteria, this job:
  • Finds the project's associated earned value reporting period and saves the earned value data of the project tasks based on that period.
  • Locks the project's Earned Value Reporting Period field.
For more information, see Project Management.
 
Requirements
 
To create a historical snapshot, the project requires a current baseline. The project also requires an association with a valid historical earned value reporting period. No data is generated for new period definitions until those periods have elapsed. Until the project is associated with at least one past period type definition, only the 
Period Number = 0
 records are updated. If you associate a new period type definition with investments and run the job, it generates no rows in the PRJ_EV_HISTORY table for past periods. Once enough time has passed that the current period is in the past, the next run of the job creates the periods for that project. 
 
Restrictions
 
You cannot run more than one instance of this job at the same time.
 
Parameters
 
The following parameters are provided:
  •  
    Investment: 
    Defines the name of the investment on which this job runs.
  •  
    OBS Unit: 
    Defines the name of the OBS Unit for the project on which this job runs.
  •  
    Investment Manager: 
    Specifies the name of the resource managing the project.
  •  
    Lag: 
    Determines the number of days to wait before taking the snapshot. Use this setting to defer taking a historical snapshot so that your organization can reconcile the actual values from one system to another.
  •  
    Rewrite Existing Snapshot: 
    Directs the job to regenerate the current reporting period snapshot and replace the existing current periodic snapshot with updated data. When cleared, projects that already have periodic snapshots are skipped.
    Default:
     Cleared
  •  
    Show Projected ACWP: 
    Directs the job to create data for the projected actual cost of work performed (ACWP) of all level-1 tasks in the work breakdown structure (WBS) for a project.
    Default:
     Cleared (disabled)
  •  
    Show Projected BCWP: 
    Directs the job to create data for the projected budgeted cost of work performed (BCWP) of all level-1 tasks in the work breakdown structure (WBS) for a project.
    Default:
     Cleared (disabled)
  •  
    Show Projected BCWS: 
    Directs the job to create data for the projected budgeted cost of work scheduled (BCWS), as of the current date, for projects and project tasks.
    Default:
     Cleared (disabled)
If you do not complete any of the parameters, all investments are processed.
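The ACWP, BCWP, and BCWS values captured in the snapshots feed standard earned value analysis. As a reminder of how the three base values combine, here are the textbook EVM formulas; this is generic earned value arithmetic, not product code:

```python
def earned_value_metrics(acwp: float, bcwp: float, bcws: float) -> dict:
    """Standard earned value metrics derived from the three base values:
    cost variance, schedule variance, and the two performance indexes."""
    return {
        "cost_variance": bcwp - acwp,      # CV = BCWP - ACWP
        "schedule_variance": bcwp - bcws,  # SV = BCWP - BCWS
        "cpi": bcwp / acwp,                # cost performance index
        "spi": bcwp / bcws,                # schedule performance index
    }

# A project that has spent 110 to earn 100 of a scheduled 120:
metrics = earned_value_metrics(acwp=110.0, bcwp=100.0, bcws=120.0)
```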
Update Earned Value and Cost Totals Job
This job calculates the earned value and costs for projects and NPIOs. You can select the investment using the Browse Investment field on the Job Properties page.
This job tracks investment progress by calculating earned value and updating costs, and invokes the Update % Complete job before it runs. The current earned value totals are calculated and recorded through the current date for one or more investments. The data is stored in a reserved row in the PRJ_EV_HISTORY (earned value history) table. The saved current earned value data totals appear in fields on investments and tasks.
When no specific investment is selected, the job processes all active investments and skips inactive ones. However, the Investment parameter does allow you to select an inactive investment; the job then runs for that investment and updates its costs. Similarly, invoking the job from an inactive investment through the Team page Actions menu Update Cost Totals link updates the costs for that inactive investment.
Schedule this job to run regularly in the background. You can also invoke this job on demand for an investment by selecting Update Cost Totals from the Actions menu. For more information, see 
Project Management
.
Restrictions
This job cannot run concurrently with any other instance of the Update Earned Value and Cost Totals job.
Parameters
  •  
    Investment:
     Select one active or inactive investment. Without a selected investment, the job processes all active investments.
  •  
    OBS Unit: 
    Defines the name of the OBS Unit for the project on which this job runs.
  •  
    Investment Manager: 
    Specifies the name of the resource managing the investment.
A large number of team members on investments can impact job performance. We recommend that you define the job using parameters to reduce the volume of updated data.
Update Estimates from Allocations Job
This job updates the effort task ETC to match the allocation, starting from each team member's actuals through date. This job can be run for both NPIOs and projects.
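The matching rule can be sketched as follows. The per-day allocation map is an illustrative simplification (the job works against Clarity's allocation curves), and the function name is an assumption:

```python
from datetime import date

def effort_task_etc(allocation_by_day: dict[date, float],
                    actuals_through: date) -> float:
    """Sketch of the job's rule: the effort task ETC becomes the sum of
    the team member's allocated hours on days strictly after the
    actuals-through date; earlier days are already covered by actuals."""
    return sum(hours for day, hours in allocation_by_day.items()
               if day > actuals_through)

# Five days allocated at 8h each; actuals posted through January 3,
# so only January 4 and 5 remain as ETC.
allocation = {date(2024, 1, d): 8.0 for d in range(1, 6)}
etc = effort_task_etc(allocation, actuals_through=date(2024, 1, 3))
```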
Restrictions
  • Projects without an Effort Task are not processed. The job updates the ETC values only for active investments and ignores all inactive investments.
  • This job processes a high volume of team and task data. Run it during off-peak hours. Scheduling this job during normal work hours can affect overall system performance.
 
Parameters
 
  •  
    Investment
    Specifies an active investment that the user has edit access rights to. The job processes the selected investment.
  •  
    Manager
    Specifies a project or investment manager whose resources the user has view access rights to. The job processes all active investments that are assigned to the selected manager.
  •  
    Investment OBS
    Specifies an investment OBS unit that the user has view access rights to. The job processes all active investments that are associated with the selected OBS unit, based on the OBS Mode setting.
  •  
    OBS Mode
    Determines which branches of the selected OBS structure are processed when selecting investments using an OBS Unit.
    Values:
     Unit Only, Unit and Ancestors, Unit and Descendants
    Default: 
    Unit Only.
  •  
    Investment Type:
    Indicates the investment type: Applications, Assets, Ideas, Other Work, Portfolios, Programs, Services, Projects, and Products.
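The three OBS Mode values determine which branch of the OBS structure is searched for investments. The following sketch uses a hypothetical three-level OBS (Corp > IT > Apps) and assumed parent/child maps to illustrate the scoping, not the product's actual traversal code:

```python
def obs_units_in_scope(unit: str,
                       parents: dict[str, str],
                       children: dict[str, list[str]],
                       mode: str) -> set[str]:
    """Return the OBS units whose investments the job processes,
    given the selected unit and the OBS Mode value."""
    units = {unit}
    if mode == "Unit and Ancestors":
        node = unit
        while node in parents:          # walk up to the root
            node = parents[node]
            units.add(node)
    elif mode == "Unit and Descendants":
        stack = [unit]
        while stack:                    # walk down the whole subtree
            for child in children.get(stack.pop(), []):
                units.add(child)
                stack.append(child)
    return units                        # "Unit Only": just the selected unit

# Hypothetical OBS: Corp > IT > Apps
parents = {"Apps": "IT", "IT": "Corp"}
children = {"Corp": ["IT"], "IT": ["Apps"]}
scope = obs_units_in_scope("IT", parents, children, "Unit and Descendants")
```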
 
Error Handling
 
The user who schedules these jobs may not have the appropriate security rights to all the resources on the investments.
When handling staffing decisions, we recommend that the appropriate manager (with the appropriate access rights) makes the allocation and ETC decisions for their resources.
If the scheduler of the job does not have the appropriate access rights, the following occurs:
  • The job proceeds through all resources on the designated investments.
  • The job logs the error in the jobs log file and includes the following information:
    • Investment Name
    • Team Member Name
    • Error Message
    The log file, App-niku-xbl.log, is located in <Clarity_Runtime>\logs.
Validate Process Definitions Job
This job checks the integrity of a process; for example, whether a subprocess called by the process is active, or whether a step action condition is valid. This job is useful when you use the process definition XOG to import many process definitions. Process definitions that are imported into the target system are not validated and remain in draft mode. You can then run this job to batch-validate and activate the process definitions.
Certain actions can invalidate processes; for example, ODF object deletion, object attribute deletion, or process deactivation. Schedule this job regularly to revalidate the process definitions.
Restrictions
None
Requirements
The job validates the process definitions for which the logged-in user has the Process Definition - View access right. Optionally, it activates the process definitions when they validate successfully.
Parameters
  •  
    Activate Process If Validated
    When enabled and Process Status is Validated, the job automatically activates the process definitions.
    Default:
     Cleared
  •  
    Process Status
    Specifies the current status of the process definitions to validate.
    Values:
    • Errors Encountered
    • Not Validated
    • Re-validation Required
    • Validated