How to Set up the Cluster Environment for Unified Self-Service

For optimal performance, set up Unified Self-Service in a cluster environment. In an ideal cluster environment, multiple server instances with identical configurations span multiple nodes. All instances in a cluster work together to provide high availability, reliability, and scalability.
You need the following minimum requirements for the cluster environment:
  • Two machines to set up the Unified Self-Service nodes
  • One machine to set up Solr 1.4.1, Notification Server, and Shared Repository
  • One Database server
  • One machine for Load Balancer
The following diagram shows an ideal cluster environment for Unified Self-Service:
 
[Figure: CA Unified Self-Service cluster environment]
Follow these steps:
Set up and Configure Node 1
Set up Unified Self-Service on node 1 and configure it for the cluster environment.
Follow these steps:
  1. Install Unified Self-Service on node 1. Ensure that you specify the machine name as the Database Host.
  2. Onboard a tenant and configure the CA SDM data source. For more information, see Onboard Tenants.
  3. Shut down Unified Self-Service from services.msc.
  4. Open the portal-ext.properties file located in the OSOP folder of the Unified Self-Service installation directory.
  5. Append the file with the following lines:
    #cluster
    cluster.link.enabled=true
    ehcache.cluster.link.replication.enabled=true
    lucene.replicate.write=false
    net.sf.ehcache.configurationResourceName=/custom-ehcache/hibernate-clustered.xml
    ehcache.multi.vm.config.location=/custom-ehcache/liferay-multi-vm-clustered.xml
  6. Extract the custom-ehcache.zip file from the OSOP\tools folder to the OSOP\tomcat-7.0.40\webapps\ROOT\WEB-INF\classes\ folder.
    Ensure that after the extraction, the custom-ehcache folder is created in the same path as the hibernate-clustered.xml and liferay-multi-vm-clustered.xml files.
  7. Open the server.xml file from the OSOP\tomcat-7.0.40\conf\ folder.
  8. Replace <Engine name="Catalina" defaultHost="localhost"> with <Engine name="Catalina" defaultHost="localhost" jvmRoute="node1">.
  9. Restart Unified Self-Service services.
  10. Configure the common repository for attachments on the Unified Self-Service common server. For more information, see the Liferay documentation.
  11. Copy the OSOP\jetty-7.2.2.v20101205 folder to the Unified Self-Service common server to set up the notification server there.
  12. Configure the notification server on the Unified Self-Service node:
    1. Open the portal-ext.properties file and search for cometd.
    2. Replace localhost with the hostname or IP address of the Unified Self-Service common server, and replace the port with the port number on which Jetty is running. For example,
      #cometd configurations begin
      cometd.enable=true
      cometd.contextPath=/notification-server
      #internal properties are used by java client
      cometd.internal.host=localhost
      cometd.internal.port=18686
      cometd.internal.protocol=http
      #external properties are used by jquery clients
      cometd.external.host=localhost
      cometd.external.port=18686
      cometd.external.protocol=http
      #cometd configurations end
    3. Stop the Unified Self-Service Jetty server from services.msc.
    Unified Self-Service node 1 is configured.
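On Linux, the server.xml change from step 8 can be scripted with sed. The following is a sketch only: it demonstrates the substitution on a sample line instead of modifying server.xml in place. In a real installation you would run the sed command against OSOP/tomcat-7.0.40/conf/server.xml (back it up first), and the Engine element may carry additional attributes.

```shell
# Demonstrate the jvmRoute substitution on a sample Engine line.
# Prints: <Engine name="Catalina" defaultHost="localhost" jvmRoute="node1">
echo '<Engine name="Catalina" defaultHost="localhost">' |
  sed 's/defaultHost="localhost">/defaultHost="localhost" jvmRoute="node1">/'
```

The jvmRoute value must be unique per node; node 2 uses jvmRoute="node2" in the same position.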
Install and Configure Apache Solr
Install Apache Solr on the Unified Self-Service common server.
Follow these steps:
  1. Download Solr 1.4.1 on the Unified Self-Service common server, for example, from http://archive.apache.org/dist/lucene/solr/1.4.1.
  2. From the Solr distribution, copy the example folder so that it is at the same level as the Unified Self-Service installation.
  3. Define an environment variable for the location where Solr stores the search index.
    Example:
     $SOLR_HOME=/openspace/solr
    This environment variable can be defined on the start up sequence of the operating system, in the environment for the user who is logged in, or in the start-up script for your application server. If you are using Tomcat to host Solr, modify setenv.sh or setenv.bat and add the environment variable there.
  4. Use this environment variable as a parameter for JVM during the start up configuration of the application server.
    If you are using Tomcat, edit catalina.sh or catalina.bat and add the -Dsolr.solr.home=$SOLR_HOME to the $JAVA_OPTS variable.
  5. Install Solr on the Unified Self-Service common server. For more information, see http://lucene.apache.org/solr.
    Install the Solr search engine on a separate machine from Liferay. To integrate the Solr search engine with Liferay, restart the application server.
  6. Shut down Solr.
  7. From the Solr distribution, copy solr.war to the webapps directory of your servlet container.
  8. Start Solr on the Jetty server:
    1. Go to $SOLR_HOME on the command prompt.
    2. Run java -Dsolr.solr.home=/openspace/solr -jar start.jar to set the java system property solr.solr.home.
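Steps 3 and 4 above can be sketched as a few lines in Tomcat's setenv.sh. This is an illustration under the guide's example path /openspace/solr; adjust the path for your environment.

```shell
# Sketch of setenv.sh additions: export SOLR_HOME and pass it to the JVM.
SOLR_HOME=/openspace/solr
export SOLR_HOME
# Append the Solr home system property to the JVM options.
JAVA_OPTS="$JAVA_OPTS -Dsolr.solr.home=$SOLR_HOME"
export JAVA_OPTS
echo "$JAVA_OPTS"
```

With this in place, the application server starts the JVM with solr.solr.home pointing at the index location, which is what the java -Dsolr.solr.home=... -jar start.jar command in step 8 does explicitly for Jetty.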
Configure Liferay to use Solr for Searching
Do not run Liferay and Solr search engine on the same machine.
Follow these steps:
  1. Install the Solr Liferay plugin on Unified Self-Service node 1.
  2. Copy solr-web.war file from OSOP\tools to the OSOP\deploy folder of the Unified Self-Service installation directory.
  3. Open solr-spring.xml from the WEB-INF/classes/META-INF folder.
  4. Modify the following value to point to the server where Solr is up and running:
    <constructor-arg type="java.lang.String" value="http://localhost:8080/solr" />
  5. Save the solr-spring.xml file and place it back in the plugin archive.
  6. Copy the schema.xml file from the extracted solr-web folder (docroot/WEB-INF/conf) to $SOLR_HOME/conf folder of the Unified Self-Service common server.
  7. Restart Solr on the Unified Self-Service common server.
    Liferay server search is now upgraded to use Solr.
  8. From Unified Self-Service, select Control Panel, Server, Server Administration.
  9. Click the Execute button next to Reindex all search indexes at the bottom of the page.
    Liferay begins sending indexing requests to Solr. Once Solr has indexed the data, Unified Self-Service search uses Solr as the search index. This setup is ideal for a clustered environment because all the Unified Self-Service nodes share one search server and one search index, and the search server operates independently of the nodes.
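For example, if Solr runs on the common server under a hostname such as solr-common (a placeholder; substitute your common server's hostname or IP address) on port 8080, the modified line in solr-spring.xml from step 4 would read:

```xml
<!-- solr-spring.xml: point Liferay's Solr client at the common server -->
<constructor-arg type="java.lang.String" value="http://solr-common:8080/solr" />
```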
Set up and Configure Node 2
Set up and configure Unified Self-Service on node 2 for the cluster environment.
Follow these steps:
  1. Stop the Unified Self-Service services on node 1.
  2. Create backup of node 1:
    • (Windows) Run TakeBackup.bat from the bin folder of the Unified Self-Service installation directory.
    • (Linux) Run TakeBackup.sh from the bin folder of the Unified Self-Service installation directory.
    The backup is created as the CAOpenSpaceBackup.car file in the Unified Self-Service installation directory.
  3. On node 2, install Unified Self-Service with the same configuration as node 1:
    1. Select Use existing database option.
    2. Ensure that you use the same database host name as used during the Unified Self-Service node1 setup.
    3. Select the node 1 backup file CAOpenSpaceBackup.car.
      Note: Ensure that the user has all the privileges required for a basic installation.
    4. Ensure that you extracted the custom-ehcache.zip file in the OSOP\tomcat-7.0.40\webapps\ROOT\WEB-INF\classes\ folder of the Unified Self-Service installation directory.
      Ensure that after the extraction, the custom-ehcache folder is created in the same path as the hibernate-clustered.xml and liferay-multi-vm-clustered.xml files.
    5. Open the server.xml file from the OSOP\tomcat-7.0.40\conf\ folder and replace <Engine name="Catalina" defaultHost="localhost"> with <Engine name="Catalina" defaultHost="localhost" jvmRoute="node2">.
    After the successful configuration, two Unified Self-Service nodes run with the same database in a cluster.
Configure the Load Balancer
Configure the load balancer to increase scalability and to maintain performance.
Follow these steps:
  1. Download and install Apache HTTP server 2.2.
  2. Download mod_jk.so and copy it to APACHE_HOME\modules\ directory.
  3. Modify the APACHE_HOME\conf\httpd.conf file:
    • Append the file with the following lines:
      JkWorkersFile conf/workers.properties
      JkLogFile logs/mod_jk.log
      JkLogLevel info
      JkLogStampFormat "[%a %b %d %H:%M:%S %Y]"
      JkMount /* loadbalancer
    • Add the following entry in the httpd.conf file (if not already present):
      # Load the mod_jk connector
      LoadModule jk_module modules/mod_jk.so
  4. Define the IP addresses of node1 and node 2 in the APACHE_HOME\conf\workers.properties file:
    worker.node1.host=<IP ADDRESS OF NODE1>
    worker.node2.host=<IP ADDRESS OF NODE2>
  5. Start the load balancer.
  6. On the client machine (from where you access Unified Self-Service), modify the C:\Windows\System32\drivers\etc\hosts file to point to the IP address of the load balancer. For example, if you onboarded a tenant with the virtual host test.openspace.com and the IP address of the Apache server is 10.11.12.13, add the line 10.11.12.13 test.openspace.com to the hosts file. This allows access to the virtual host from the client machine.
    Load balancer is configured.
  7. Stop the Unified Self-Service services on node 1 and verify that requests are getting mapped to node 2 even if node 1 is down.
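The host lines in step 4 are only part of a working workers.properties. A fuller file might look like the following sketch; the worker.list entry, the AJP port 8009, the worker types, and the load-balancer worker are assumptions based on a standard mod_jk setup and are not taken from this guide. The worker names must match the jvmRoute values (node1, node2) configured in server.xml on each node so that sticky sessions route correctly, and the balancer worker name must match the JkMount directive (loadbalancer).

```properties
# Sketch of APACHE_HOME\conf\workers.properties (assumed values).
worker.list=loadbalancer

# Node 1: assumes the default Tomcat AJP connector port 8009.
worker.node1.host=<IP ADDRESS OF NODE1>
worker.node1.port=8009
worker.node1.type=ajp13

# Node 2: same assumptions as node 1.
worker.node2.host=<IP ADDRESS OF NODE2>
worker.node2.port=8009
worker.node2.type=ajp13

# Load-balancer worker; names in balance_workers must match the
# jvmRoute values so sessions stick to the node that created them.
worker.loadbalancer.type=lb
worker.loadbalancer.balance_workers=node1,node2
worker.loadbalancer.sticky_session=1
```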