
In today’s complex regulatory landscape, organizations across industries are required to comply with various regulations, including the Sarbanes-Oxley Act (SOX). SOX compliance ensures accountability and transparency in financial reporting, protecting investors and the integrity of the financial markets. However, manual compliance processes can be time-consuming, error-prone, and costly.

Relevance Lab’s RLCatalyst and RPA solutions provide a comprehensive suite of automation capabilities that can streamline and simplify the SOX compliance process. Organizations can achieve better quality, velocity, and ROI tracking while saving significant time and effort.

SOX Compliance Dependencies on User Onboarding & Offboarding
With many employees now working from home or remote locations, managing resources and time has become harder. In the context of user provisioning, this raises risks such as unauthorized access to systems when individual users are granted access beyond their roles or responsibilities.

Most organizations follow a defined user-provisioning process: a user access request is raised with relevant details, including:

  • Username
  • User Type
  • Application
  • Roles

The request then goes through line manager approval and application owner approval as required by policy, and finally IT grants the access. Several organizations still perform this process manually, creating a security risk.

In such a situation, automation plays an important role. It reduces manual work, labor cost, and reliance on individual resources, and improves time management. An automation process built with proper design, tools, and security reduces the risk of material misstatement, unauthorized access, and fraudulent activity. ServiceNow has also helped in tracking and archiving the evidence (an evidence repository) essential for compliance. Effective compliance results in better business performance.
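The approval-gated provisioning described above can be sketched in a few lines. This is a minimal illustration, not RLCatalyst's actual implementation; the function and field names are hypothetical:

```python
from datetime import datetime, timezone

audit_log = []  # evidence repository for compliance review

def provision_access(user, roles, manager_approved, owner_approved):
    """Grant access only when both approvals are present; log every decision."""
    granted = manager_approved and owner_approved
    audit_log.append({
        "user": user,
        "roles": roles,
        "granted": granted,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return granted

# A rejected request leaves an audit trail but grants nothing.
provision_access("jdoe", ["AP_CLERK"], manager_approved=True, owner_approved=False)
```

The point of the sketch is that the grant decision and the audit evidence are produced by the same code path, so the compliance trail can never be skipped.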

RPA Solutions for SOX Compliance
Robotic process automation (RPA) is quickly becoming a requirement in every industry looking to eliminate repetitive, manual work through automation and behavior mimicry. This reduces a company’s use of resources, saves money and time, and improves the accuracy and standard of the work being done. Many businesses are still not taking advantage of RPA in the IT compliance process due to barriers such as lack of knowledge, the absence of a standardized methodology, or a preference for conventional ways of working.

Below are the areas which we need to focus on:

  • Standardization of Process: There are opportunities to standardize SOX compliance techniques, frameworks, controls, and processes even though every organization is diverse and uses different technologies and processes. Around 30% of the environment in a typical organization may be deemed high-risk, whereas the remaining 70% is medium- to low-risk. To improve the efficiency of the compliance process, a large portion of the paperwork, testing, and reporting related to that 70% can be standardized. This would make it possible to concentrate more resources on high-risk areas.
  • Automation & Analytics: Opportunities to add robotic process automation (RPA), continuous control monitoring, analytics, and other technology grow as compliance processes become more mainstream. These prospective SOX automation technologies not only have the potential to increase productivity and save costs, but they also offer a new viewpoint on the compliance process by allowing businesses to gain insights from the data.


How Can Automation Reduce Compliance Costs?


  • Shortening the duration and effort needed to complete SOX compliance requirements: Many of the time-consuming and repetitive SOX compliance procedures, including data collection, reconciliation, and reporting, can be automated. This can free up your team to focus on more strategic and value-added activities.
  • Enhancing the precision and completeness of data related to SOX compliance: Automation can aid in enhancing the precision and thoroughness of SOX compliance data by lowering the possibility of human error. Automation can also aid in ensuring that information regarding SOX compliance is gathered and examined in a timely and consistent manner.
  • Recognizing and addressing SOX compliance concerns faster: By giving you access to real-time information about your organization’s controls and procedures, automation can help you detect and address SOX compliance concerns more rapidly. By doing this, you can prevent expensive and disruptive compliance failures.

Automating SOX Compliance using RLCatalyst:
Relevance Lab’s RLCatalyst platform provides a comprehensive suite of automation capabilities that can streamline and simplify the SOX compliance process. By leveraging RLCatalyst, organizations can achieve better quality, velocity, and ROI tracking, while saving significant time and effort.



  • Continuous Monitoring: RLCatalyst enables continuous monitoring of controls, ensuring that any deviations or non-compliance issues are identified in real-time. This proactive approach helps organizations stay ahead of potential compliance risks and take immediate corrective actions.
  • Documentation and Evidence Management: RLCatalyst’s automation capabilities facilitate the seamless documentation and management of evidence required for SOX compliance. This includes capturing screenshots, logs, and other relevant data, ensuring a clear audit trail for compliance purposes.
  • Workflow Automation: RLCatalyst’s workflow automation capabilities enable organizations to automate and streamline the entire compliance process, from control testing to remediation. This eliminates manual errors and ensures consistent adherence to compliance requirements.
  • Reporting and Analytics: RLCatalyst provides powerful reporting and analytics features that enable organizations to gain valuable insights into their compliance status. Customizable dashboards, real-time analytics, and automated reporting help stakeholders make data-driven decisions and meet compliance obligations more effectively.

Example – User Access Management


Each entry below lists the risk, the control, the manual approach, and the automated approach.

Risk 1: Unauthorized users are granted access to applicable logical access layers; key financial data/programs are intentionally or unintentionally modified.
  • Control: New and modified user access to the software is approved by an authorized approver as per the company IT policy. All access is appropriately provisioned.
  • Manual: Access to the system is provided manually by the IT team based on the approval given per the IT policy and the roles and responsibilities requested. The SOD (Segregation of Duties) check is performed manually by the Process Owner/Application Owner per the IT policy.
  • Automated: Access is provided automatically by an auto-provisioning script designed per the company IT policy. The BOT checks for SOD role conflicts and reports them to the Process Owner/Application Owner. If the approver rejects the request, the BOT grants no access, and audit logs are maintained for compliance purposes.

Risk 2: Unauthorized users are granted privileged rights; key financial data/programs are intentionally or unintentionally modified.
  • Control: Privileged access, including administrator and superuser accounts, is appropriately restricted from accessing the software.
  • Manual: Access to the system is provided manually by the IT team based on the given approval per the IT policy. A manual validation check and approval on restricted access are provided by the Process Owner/Application Owner per company IT policy.
  • Automated: Access is provided automatically by an auto-provisioning script designed per the company IT policy. If the approver rejects the request, the BOT grants no access, and audit logs are maintained for compliance purposes. The BOT can also limit the count and duration of access based on configuration.

Risk 3: Unauthorized users are granted access to applicable logical access layers; key financial data/programs are intentionally or unintentionally modified.
  • Control: Access requests to the application are properly reviewed and authorized by management.
  • Manual: User access reports are extracted manually for access review using tools or with IT’s help. Review comments are then passed to IT for de-provisioning of access.
  • Automated: The BOT extracts system-generated user reports and compares the active user listing with the HR termination listing to identify terminated users. The BOT can be configured to de-provision access for users flagged in the review report for unauthorized access.

Risk 4: Unauthorized users retain access to applicable logical access layers if access is not removed in a timely manner.
  • Control: Terminated application users’ access rights are removed on a timely basis.
  • Manual: System access is deactivated manually by the IT team based on the approval provided per the IT policy.
  • Automated: System access is deactivated by an auto-provisioning script designed per the company IT policy. The BOT can check the user’s termination date and deactivate system access if SSO is enabled, or deactivate user access based on approval.

The table provides a detailed comparison of the manual and automated approaches. Automation can deliver 40-50% gains in cost, reliability, and efficiency.
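The terminated-user reconciliation from the table (comparing the active user listing against the HR termination listing) reduces to a simple set intersection. A minimal sketch, with invented sample data:

```python
def find_terminated_with_access(active_users, hr_terminated):
    """Users still active in the application despite being terminated per HR records."""
    return sorted(set(active_users) & set(hr_terminated))

# Hypothetical listings; in practice these come from the application and HR system exports.
flagged = find_terminated_with_access(
    active_users=["asmith", "bjones", "cwu"],
    hr_terminated=["bjones", "dlee"],
)
# Each flagged user would be queued for de-provisioning by the BOT.
```

A production BOT would feed `flagged` into the de-provisioning workflow and record the outcome in the evidence repository.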

Conclusion
SOX compliance is a critical aspect of ensuring the integrity and transparency of financial reporting. By leveraging automation using RLCatalyst and RPA solutions from Relevance Lab, organizations can streamline their SOX compliance processes, reduce manual effort, and mitigate compliance risks. The combination of RLCatalyst’s automation capabilities and RPA solutions provides a comprehensive approach to achieving SOX compliance more efficiently and cost-effectively. The blog was enhanced using our own GenAI Bot to assist in creation.

For more details or enquiries, please write to marketing@relevancelab.com

References
What is Compliance as Code?
What is SOX Compliance? 2023 Requirements, Controls and More
Building Bot Boundaries: RPA Controls in SOX Systems
Get Started with Building Your Automation Factory for Cloud
Compliance Requirements for Enterprise Automation (uipath.com)
Automating Compliance Audits | Automation Anywhere




With the rise of Artificial Intelligence (AI), many enterprises and existing customers are looking into ways to leverage this technology for their own development purposes and use cases. The field is rapidly attracting investment, with adoption proceeding iteratively from simple use cases toward more complex business problems. In working with early customers, we have found the following themes to be the first use cases for GenAI adoption in an enterprise context:

  • Interactive Chatbots for simple Questions & Answers 
  • Enhanced Search with Natural Language Processing (NLP) using Document Repositories with data controls
  • Summarization of Enterprise Documents and Expert Advisor Tools

While OpenAI provides models to build solutions with, a number of early adopters prefer the Microsoft Azure OpenAI Service for its enterprise features.

Microsoft’s Azure OpenAI Service provides REST API access to OpenAI’s powerful language models, including the GPT-3.5 and Embeddings model series. The REST API is one way to connect, but Azure also provides .NET, Java, JavaScript, and Azure CLI clients for communication.
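As an illustration, a chat completion request to Azure OpenAI over REST can be assembled as below. The resource name, deployment name, and key are placeholders you would replace with your own:

```python
import json

def build_chat_request(endpoint, deployment, api_version, api_key, messages):
    """Assemble URL, headers, and body for an Azure OpenAI chat completion call."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({"messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    endpoint="https://my-resource.openai.azure.com",  # hypothetical resource
    deployment="gpt-35-turbo",                        # hypothetical deployment name
    api_version="2023-05-15",
    api_key="<YOUR-API-KEY>",
    messages=[{"role": "user", "content": "Summarize our leave policy."}],
)
```

The resulting tuple can be sent with any HTTP client (e.g. `requests.post(url, headers=headers, data=body)`); the official SDKs wrap the same endpoint.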

Introduction to GenAI

  1. What is Generative AI?
    • A class of artificial intelligence systems that can create new content, such as images, text, or videos, resembling human-generated data by learning patterns from existing data.
  2. What is the purpose?
    • To create new content or generate responses that are not based on predefined templates or fixed responses.
  3. How does it work?
    • Data is collected through various methods such as scraping or reading documents, directories, or indexes; the data is then preprocessed to clean and format it for analysis. AI models, such as machine learning and deep learning algorithms, are trained on this preprocessed data, learning patterns from existing data and using that knowledge to produce new, original content.
  4. How can it be used by enterprises?
    • To assist end users (internal or external) in the form of next generation Chatbots.
    • To assist stakeholders with automating certain internal content creation processes.

Early Customer Adoption Experience
Customers wanted to experience GenAI for building awareness, validation of early use cases, and “testing the waters” with enterprise-grade security and governance for GenAI technology.

Early Use Cases Identified for Development
The primary focus area was in the content management space for enterprise data with focus on the following:

  1. End User Assistance (Chatbot)
    • Product Website Chatbot
    • Intranet Chatbot
  2. Content Creation
    • Document Summarization
    • Template based Document Generation
  3. SharePoint
    • Optical Character Recognition (OCR)
    • Cognitive Search
  4. Decision-making & Insights

Key Considerations for GenAI Leverage

  1. Limitations on current Chatbots
    • OCR
    • Closed chatbot allowing selection of pre-populated options
    • Limited scope and intelligence of responses
  2. Benefits expected from GenAI enhanced Chatbots
    • OCR
    • Human like responses
    • Ability to adapt quickly to new information
    • Multi-lingual
    • Restricts available data that Chatbot can draw from to verified Enterprise sites
  3. Potential Concerns
    • Can contain biases unintentionally learned by the model
    • Potential for errors and hallucinations

System Architecture
The system architecture using Azure OpenAI takes advantage of several services provided by Azure.



The architecture may include the following components:

Azure OpenAI Service
Azure OpenAI Service gives developers access to OpenAI’s powerful language models with the security and enterprise capabilities of Microsoft Azure. Developers can integrate these models into their applications through REST APIs and SDKs, enabling them to build intelligent and transformative solutions.

Azure Cognitive Services
Azure Cognitive Services offers a range of AI capabilities that can enhance Chatbot interactions. Services like Speech Services, Search Service, Vision Services and Knowledge Mining can be integrated to enable natural language understanding, speech recognition, and knowledge extraction.

Azure Storage
Azure Storage is a highly scalable and secure cloud storage solution offered by Microsoft Azure. It provides durable and highly available storage for various types of data, including files, blobs, queues, and tables. Azure Storage offers flexible options for storing and retrieving data, with built-in redundancy and encryption features to ensure data protection. It is a fundamental building block for storing and managing data in cloud-based applications.

Form Recognizer
Form Recognizer is a service provided by Azure Cognitive Services that uses machine learning to automatically extract information from structured and unstructured forms and documents. By analyzing documents such as invoices, receipts, or contracts, Form Recognizer can identify key fields and extract relevant data. This makes it easier to process and analyze large volumes of documents. It simplifies data entry and enables organizations to automate document processing workflows.

Service Account
A new service account would be required for the team to establish connections with Azure services programmatically. The service account will need elevated privileges as required for the APIs to communicate with Azure services.

Azure API Management
Azure API Management provides a robust solution to address hurdles like throttling and monitoring. It facilitates the secure exposure of Azure OpenAI endpoints, keeping them protected, fast, and observable. Furthermore, it offers comprehensive support for the discovery, integration, and consumption of these APIs by both internal and external users.

Typical Interaction Steps between Components
The diagram below shows the typical interaction steps between different components.



  1. The Microsoft Cognitive Search engine indexes content from the Document Repository as an async process.
  2. Using the frontend application, the user interacts with the Chatbot and sends a query.
  3. The Azure API layer forwards the query to the GPT Text Model, which transforms the user query into an optimized search input.
  4. The GPT Text Model returns this optimized search input to the Azure API orchestration layer.
  5. The API layer sends the search query to Cognitive Search.
  6. Cognitive Search returns the relevant content.
  7. The API layer sends the Cognitive Search results, along with the prompt, chat context, and history, to GenAI for response generation.
  8. Generated and summarized content is returned from GenAI.
  9. The meaningful results are shared back with the user.

The above interactions demonstrate that in this architecture the documents remain inside the secure Azure network and are managed by the search engine. This ensures that raw content is not shared with the OpenAI layer, providing controlled governance for data security and privacy.
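The orchestration steps above can be sketched as a minimal pipeline. The model and search calls are stubbed here for illustration; a real deployment would call the Azure OpenAI and Cognitive Search APIs:

```python
def rewrite_query(user_query):
    """Steps 3-4: GPT text model turns the raw query into an optimized search input (stubbed)."""
    return user_query.lower().strip("?")

def cognitive_search(search_input, index):
    """Steps 5-6: retrieve relevant passages from the enterprise index (stubbed)."""
    return [doc for doc in index if search_input in doc.lower()]

def generate_answer(query, passages, history):
    """Steps 7-8: GenAI summarizes retrieved content; raw documents never leave the index (stubbed)."""
    return f"Based on {len(passages)} document(s): {passages[0] if passages else 'no match found'}"

def answer(user_query, index, history=()):
    """Step 9: orchestrate rewrite -> search -> generate and return the result to the user."""
    search_input = rewrite_query(user_query)
    passages = cognitive_search(search_input, index)
    return generate_answer(user_query, passages, history)

index = ["Expense claims must be filed within 30 days."]
print(answer("Expense claims?", index))
```

Note the governance property of the design: only the retrieved passages, not the whole repository, reach the generation step.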

Summary
Relevance Lab is working with early customers for GenAI Adoption using our AI Compass Framework. The customers’ needs vary from initial concept understanding to deploying with enterprise-grade guardrails and data privacy controls. Relevance Lab has already worked on 20+ GenAI BOTs across different architectures leveraging different LLM Models and Cloud providers with a reusable AI Compass Orchestration solution.

To know more about how we can help you adopt GenAI solutions “The Right Way” feel free to write to us at marketing@relevancelab.com and for a demonstration of the solution at AICompass@relevancelab.com

References
Revolutionize your Enterprise Data with ChatGPT
Augmenting Large Language Models with Verified Information Sources: Leveraging AWS
AWS SageMaker and OpenSearch for Knowledge-Driven Question Answering
What’s Azure Cognitive Search?




As part of the growing interest and attention on GenAI market trends, enterprise priorities in 2023 have rapidly shifted from tracking the trends to tremendous pressure to adopt this disruptive technology. While interest is very high, most enterprises are grappling with the challenge of where to start and which approach to use. Investments from CIO budgets are being quickly carved out, but the basic dilemma remains around early use cases, security and privacy issues with enterprise data, and which platforms and tools to leverage. Relevance Lab has launched an “AI Taskforce” that includes key internal participants and customer advisory teams for this innovation. The primary focus is to define core and priority themes relevant for business and customers based on current assessment. This is an emerging space with substantial global investment, and the innovation is expected to drive major disruption in the next decade. We believe that requires an iterative model for strategy and an agile approach, with focused concept incubations in close collaboration with our customers.

Customer Needs for AI Adoption
The most common ask from all customers is using GenAI for their business with the primary goals around the following business objectives:

  • Enhancing their end customer experience and business outcomes.
  • Saving costs with better efficiency leveraging the new AI models & interaction channels.
  • Improving their core Products & Offerings with AI to ensure the business does not get disrupted or become irrelevant against competition.

The figure below captures the summary of customer asks, common business problems, and categories of solutions being explored.



Translating the above objectives into meaningful and actionable pursuits requires focusing on key friction points and leveraging the power of AI. Some common use cases we have encountered include:


  • Increasing online-user purchases and conversions by 20% with personalized customer experiences.
  • Better revenue realization with Dynamic Pricing and Propensity analysis.
  • Lower subscription-renewal losses through early detection and engagement, with 90%+ predictability.
  • Wastage reduction (US$10M annually) for a global pharma with AI-led Optimization Algorithms.
  • Better price realization for procurement (15%+) with Anomaly Detection in Plan Purchase Analytics.
  • Better information aggregation and curation for mortgages with Machine Learning (ML) classifications.

Following are early initiatives being taken for our customers leveraging GenAI:

  • Pharma Product Reviews Summary and Advisor with GenAI.
  • Deployment of Private Foundation Models and training with custom data & business rules for Advisory services in Financial Services.
  • Use of Chatbots for easier user and customer support for Media customers.
  • Access to Business Dashboards with Generative Models using prompts for E-Commerce customers in Retail.
  • Increasing productivity of Development and Testing efforts with GenAI specialized tools for Technology ISVs.

There is no doubt that the momentum of such early technology adoption is growing every day. It needs a structured program of collaboration with our customers to identify common building blocks and to enable rapid model creation, training, deployment, interaction, and fine-tuning.

Relevance Lab AI Compass Framework
We have launched the “Relevance Lab AI Compass Framework” to guide and collaborate with customers in defining the early areas of focus in building solutions leveraging AI. The goal is to have this as a prescriptive model helping jumpstart the adoption of Enterprise and GenAI “The Right Way”. The figure below explains the same.

The AI Compass Framework takes a 360-degree perspective on assessing the AI needs of an enterprise across the following pillars.

  • Product Engineering – building products that embed the power of AI
  • Business Data Decisions enhanced with AI
  • Machine Data Analysis enhanced with AI
  • Using GenAI for Business
  • Platform AI Competences – choosing the right foundation
  • Cloud AI Services – leveraging the best of breed
  • Digital Content with GenAI
  • Robotic Process Automation enhanced with AI and Intelligent Document Generation
  • Preparing Enterprise Workforce – Training with AI
  • Managed Services & Support made more efficient & cost effective
  • Improving internal Tester and Quality Productivity with AI Tools
  • Developer Productivity enhancements with AI Tools


Relevance Lab is getting deeper into the above pillars and building the right design patterns to guide our customers on “The Right Way” for enterprise adoption. The plan is also to build a foundational AI applications platform that speeds up adoption for end customers, saving time and effort while improving the quality of deliverables.

Product Engineering with AI 
This pillar focuses on how to make AI Architecture and Design patterns part of better Product Design. The charter is to find and recommend new architectures and integration with new GenAI models for making existing software products smarter with embedded AI techniques. We expect new products to adopt an “AI-First” approach to new developments. Every product in their focused vertical (Healthcare & Life Sciences, BFSI, Media & Communication, Technology) will need to embed AI into their core architecture.

Business Data Decisions with AI
This pillar defines AI-enhanced Data Engineering for common use cases and building blocks. The traditional focus of AI initiatives has been primarily on giving agile and actionable insights into the following:

  • What happened in my business? – this is Informative.
  • What will happen? – this is Predictive.
  • What should be done? – this is Prescriptive.

The new dimension GenAI has added to the above is around “Generative” capabilities. Along with the need for building new features, there is growing adoption of popular data platforms like Databricks, Snowflake, Azure Data Factory, AWS Data Lake etc. that need to integrate with product specific AI enhancements.

Machine Data Analysis with AI
Customers already focus on DevOps and AIOps, with large volumes of data generated from servers, applications, networks, security, and storage via different monitoring tools. However, there is a deluge of information and a need to reduce noise and improve response times for effective operations support. This calls for Alert Intelligence to reduce alert fatigue and Incident Intelligence to observe data across layers for faster issue diagnosis and fixes. Anomaly detection is a key need with time-series data, looking for odd patterns and flagging risks such as security issues and vulnerabilities. While AIOps brings together the need for AI across Observability, Automation, and Service Delivery, there are also ways to leverage new GenAI tools for better Chatbot support, reducing operational costs and increasing efficiency. A common customer ask is the ability to predict a failure and prevent an outage in real time with AI using these models. This requires designing Site Reliability Engineering (SRE) solutions to be more effective with AI techniques.

As with intelligent observability for infrastructure and applications using AI/ML models, there is a growing need for data pipeline observability with specialized models. With the growing scale of ML models, there is a need to track drift across design, model, and data for such pipelines, with dashboards for visualization and actionable analytics.
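As one illustration of the anomaly-detection need described above, a minimal z-score detector over a metric series might look like this (the latency data is invented for the example; production AIOps systems use far more sophisticated time-series models):

```python
from statistics import mean, stdev

def zscore_anomalies(series, threshold=3.0):
    """Return indexes of points more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# Hypothetical response-time samples in milliseconds; the 250 ms spike is the anomaly.
latency_ms = [101, 99, 102, 100, 98, 103, 250, 101]
print(zscore_anomalies(latency_ms, threshold=2.0))
```

Flagged indexes would then feed alerting or self-remediation workflows rather than raw threshold alarms, which is how noise reduction is achieved.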

Using GenAI for Business
One of the most common asks is to leverage ChatGPT APIs and suggest ways to apply this disruptive technology to existing customer and internal needs. We help customers use these tools to reduce internal costs and improve the external end-customer experience, with quick projects that define common use cases and then go deeper with the customer’s specific data and models.

We are working with early adopter customers on how to prepare and leverage GenAI for their business problems across different verticals. All large enterprises have carved out special initiatives on “How to Use GenAI” and we offer a unique program to incubate these projects.

Platform AI Competencies 
These platforms are leading innovation and solutions for companies building specialized AI applications: open-source LLMs (Large Language Models), OpenAI APIs, reusable model libraries, TensorFlow, Hugging Face, the open-source LangChain library, Microsoft Orca, Databricks, etc. This pillar goes deep into specialized use cases such as feature extraction, text classification, prompt engineering, Chatbots, summarization, generative writing, ideation, and reinforcement learning.

Cloud AI Services 
With significant existing investments of customers on public cloud providers like AWS, Azure, and GCP there is a growing need for leveraging specialized AI offerings from these providers to jumpstart adoption with security and scalability in enterprise context. Also, there is a growing momentum of new GenAI solutions from these providers like AWS CodeWhisperer, Amazon Bedrock, Azure Synapse, Microsoft Responsible AI, and specialized tools & training from Google Cloud. The growing adoption will require deep understanding and support for MLOps and LLMOps for efficient and cost-effective operations. 

Digital Content with GenAI 
One of the biggest impacts with GenAI is the evolution of smarter search and information access across customers’ existing repositories of documents, FAQ, content platforms, product brochures etc. These cover all sorts of unstructured and semi-structured information. Customers are looking to leverage Public and Proprietary LLM (Large Language Models) Models with their personal data repositories and fine-tuned models of custom business rules. This requires customers to build, train, and deploy their own models with control on security and data privacy protected.

The right architectures will have a balance between different approaches of using standard models with enterprise data vs privately deployed models for enterprise content solutions.

Managed Services & Support AI 
Chatbots and GenAI can help improve the support lifecycle across Monitoring, ServiceDesk, TechOps, Desktop Support, and User Onboarding/Offboarding. They can reduce costs and make daily tasks more efficient.

This aligns with customers’ focus on Managed Services, ServiceDesk, Command Centre, Technical Operations, and Security Ops. This pillar explores AI techniques for Incident Intelligence, Chatbots, Automation, Self-Remediation, and Virtual Agents to make teams more productive and efficient. Relevance Lab has leveraged an “Automation-First” approach for greater productivity, effective operations, and compliance.

Robotic Process Automation with Intelligence
RPA (Robotic Process Automation) is bringing significant gains for business process automation in repetitive, high-frequency tasks, along with better quality and compliance, for use cases across different industries and corporate functions. With AI, additional benefits can be achieved to make business processes frictionless. This pillar focuses on specialized use cases around AI-driven BOTs, data and document processing, and intelligent decisioning, leveraging AI tools from key partners like UiPath and Automation Anywhere.

Training with AI Technology
Companies are embarking on the goal of making all their employees AI-skilled and certified. Leveraging AI tools in everybody’s day-to-day charter will improve job efficiency. This requires setting up an AI Lab for internal training and certifications. To create such a strong foundation, this pillar is looking into creating an AI Academy with a structured program that drives “Self-Service Learning” and “Accreditation.”

Developer & Testing Productivity with AI Tools 
Adoption of AI and GenAI tools is a key goal for smarter, faster, better outcomes. For testers, the specific areas of focus are Automated Test Case Generation, Integration Test Generation, Security Co-Pilot, Performance Assessment, and Simulated Data Generation. For developers, similar plans for boosting productivity include Developer Co-Pilot, Auto-Unit Tests, GenAI Code Assist, and Compliance AI.

Co-Development Opportunities with Customers 
As part of expediting the innovation in this emerging area, we are launching a co-development program with early participants to build on use cases specific to customer verticals and domain needs. We have dedicated specialized teams working on deep GenAI and Enterprise AI skills and building re-usable components. We are offering a special six-week program for incubation and jumpstart of GenAI adoption by enterprises to build one specific use case.

To know more about how to collaborate and to share your ideas for GenAI early adoption, contact us at AICompass@relevancelab.com

References
AI Foundation Model: Generative AI on AWS
Azure OpenAI on your Data
Google Generative AI Service Offerings Designed to get you up and Running Fast
Revolutionize your Enterprise Data with ChatGPT
A CIO and CTO Technology Guide to Generative AI




Currently, all large enterprises are dealing with multiple cloud providers, and the situation is further complicated when M&A leads to integrating multiple organizations, and when vendors across Infrastructure, Digital, Enterprise Systems, and Collaboration tools bring their own cloud footprints bundled with services. In this blog, we explain the common scenario faced by large companies and how to create “The Right Way” to adopt scalable Multi-Cloud Workload Planning and Governance Models.

Customer Needs
The customers facing such challenges usually share with us the following brief:

  • Assess existing workloads on AWS, Azure, GCP for basic health & maturity diagnostics.
  • Suggest a mature Cloud Management & Governance model for ensuring “The Right Way” to use the Cloud for multi-account, secure, and compliant best practices.
  • Recommend a model for future workloads migration and choice of cloud providers for optimal usage and ability to move new workloads to cloud easily.

Primary Business Drivers
Following are the key reasons for customers seeking Multi-Cloud Governance “The Right Way.”

  • Cost optimization and tracking for existing usage.
  • Ability to launch in new regions/countries in the cloud with easy, secure, and standardized processes.
  • Bring down the cost of ownership of Cloud Assets – Infra/Apps/Managed Services – by leveraging Automation and best practices.

Approach Taken
The basic approach followed for helping customers through the multi-cloud maturity models involves a PLAN-BUILD-RUN process as explained below:

Step-1: Planning & Assessment Phase
This involves working with customer teams to finalize the Architecture, Scope, Integration, and Validation needs for the Cloud Assessment. The primary activities covered under this phase are the following:

  • Coverage Analysis
    • Do a detailed analysis of all three Cloud Providers (AWS, Azure, GCP) and recommend what should be an ongoing strategy for Cloud Provider adoption.
  • Maturity Analysis
    • Do an assessment of current Cloud usage against industry best practices and share the maturity scorecard of customer setup.
  • Security Exposure
    • Find key gaps on security exposure and suggest ways for better governance.
  • Cost Assessment
    • Consolidation and cost optimization to have more efficient cloud adoption.

The foundation for analysis covers Cloud Provider specific analysis based on Well-Architected Frameworks as explained in the figure below:



Step-2: Build & Operationalize Phase
This primarily involves adoption of mature Cloud Governance360 and Well-Architected Models with best practices across key areas.

  • Accounts & Organization Units
  • Guardrails
  • Workloads Migration
  • Monitoring, Testing, Go-Live & Training
  • Documentation, Basic Automation for Infrastructure as Code
  • SAML Integration

The playbook for Build & Operationalize phase is based on Relevance Lab prescriptive model for using Cloud “The Right Way” as explained in the figure below.



Step-3: Ongoing Managed Services Run Phase
Post go-live, ongoing managed services ensure that the best practices created as part of the foundation are implemented and that an "Automation-First" approach is used for Infrastructure, Governance, Security, Cost Tracking, and Proactive Monitoring. Common activities under the Run phase cover regular tasks; a snapshot is provided below:

Daily Activities:

  • Monitoring & Reporting – App & Infra using CloudWatch – Availability, CPU, Memory, Disk Space, details of security-blocked requests; Cost using Cost Explorer.
  • Alert acknowledgement and Incident handling.
  • Publish daily report.
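The daily cost line of this report can be scripted against the Cost Explorer `get_cost_and_usage` API. The sketch below is our own illustration, not part of ServiceOne; the helper names are hypothetical, and the live call is isolated in a function that requires AWS credentials.

```python
import datetime as dt

def daily_cost_request(today):
    """Build a Cost Explorer query covering the previous full day, grouped by service."""
    start = today - dt.timedelta(days=1)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": today.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "SERVICE"}],
    }

def summarize(response):
    """Flatten a get_cost_and_usage response into {service: cost} for the daily report."""
    report = {}
    for period in response["ResultsByTime"]:
        for group in period.get("Groups", []):
            service = group["Keys"][0]
            report[service] = report.get(service, 0.0) + float(
                group["Metrics"]["UnblendedCost"]["Amount"]
            )
    return report

def run_daily_report():
    """Live call; requires AWS credentials with Cost Explorer access."""
    import boto3
    ce = boto3.client("ce")
    return summarize(ce.get_cost_and_usage(**daily_cost_request(dt.date.today())))
```

The summarized dictionary can then be formatted into the daily report published to stakeholders.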

Weekly Activities:

  • Check Scan Reports for most recent critical vulnerabilities.
  • Monitor Security Hub for any new critical non-compliances.
  • Create a plan of action to address them.
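The weekly Security Hub check can likewise be automated. The sketch below is our own illustration (helper names are hypothetical): it builds `get_findings` filters for active, failed, CRITICAL-severity findings and groups them by resource to seed the plan of action.

```python
def critical_noncompliance_filters():
    """Security Hub filters: active findings, CRITICAL severity, failed compliance."""
    return {
        "SeverityLabel": [{"Value": "CRITICAL", "Comparison": "EQUALS"}],
        "ComplianceStatus": [{"Value": "FAILED", "Comparison": "EQUALS"}],
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
    }

def group_by_resource(findings):
    """Group finding titles by resource id to seed the weekly plan of action."""
    plan = {}
    for finding in findings:
        for resource in finding.get("Resources", []):
            plan.setdefault(resource["Id"], []).append(finding["Title"])
    return plan

def fetch_weekly_findings():
    """Live call; requires AWS credentials and Security Hub enabled in the account."""
    import boto3
    hub = boto3.client("securityhub")
    page = hub.get_findings(Filters=critical_noncompliance_filters(), MaxResults=100)
    return group_by_resource(page["Findings"])
```

A production version would paginate with `NextToken`; this sketch reads only the first page.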

Monthly Activities:

  • Patch Management.
  • Budgets Vs Costs Report.
  • Clean-up of stale/inactive users/accounts.
  • Monthly Metrics.
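Stale-user clean-up reduces to a simple rule. The sketch below is illustrative only: the 90-day threshold is an assumed policy (not an RL standard), and the input dicts mirror the shape of IAM `ListUsers` entries, where `PasswordLastUsed` may be absent for users who never logged in.

```python
import datetime as dt

STALE_AFTER = dt.timedelta(days=90)  # assumed policy threshold, not an RL standard

def is_stale(last_activity, now, threshold=STALE_AFTER):
    """A user is stale if they never logged in or have been inactive past the threshold."""
    return last_activity is None or (now - last_activity) > threshold

def stale_users(users, now):
    """users: dicts shaped like IAM ListUsers entries; PasswordLastUsed may be missing."""
    return [u["UserName"] for u in users if is_stale(u.get("PasswordLastUsed"), now)]
```

The resulting list feeds the monthly clean-up ticket; actual deactivation should go through the normal change-approval workflow.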

ServiceOne framework from Relevance Lab provides a mature Managed Services Model.

Sample Assessment Report
The analysis is done across 4 key areas, as covered under the Plan phase and explained below.

  • Cloud Provider Specific Analysis
    • Workload distribution analysis across all three providers, mapped to a questionnaire of 50+ best practices.
  • 5-Pillars Well-Architected Analysis
    • Architecture & Performance Efficiency, Cost Optimization, Reliability & DR, Operational Excellence & Standardization, Security.
    • Global workloads analyzed across all different environments.
  • Security Findings
    • Identified environments on Azure with significant exposure that need fixing.
    • Also suggested AWS Security Hub for a formal scorecard and specific steps toward maturity.
  • Cost Optimization
    • Analyzed costs across Environments, Workloads, and Apps.

Based on the above, a final Assessment report is created with recommendations to fix immediate issues while also addressing medium-term changes for ongoing maturity. The figure below shows a sample assessment report.



Summary
Relevance Lab is a specialist company in cloud adoption and workload planning. Working with 50+ customers on multiple engagements, we have created a mature framework for Multi-Cloud Workload and Governance Assessment. It is built on the foundation of best practices from the Cloud Adoption Framework (CAF) and Well-Architected Frameworks (WAF), enhanced with specific learnings and accelerators based on our Governance360 and ServiceOne offerings, to speed up the transition from unmanaged, ad-hoc models to "The Right Way" of a multi-cloud foundation.

To know more about how we can help, feel free to contact us at marketing@relevancelab.com

References
AWS Well-Architected
Microsoft Azure Well-Architected Framework
Google Cloud Architecture Framework
AWS Cloud Adoption Framework (AWS CAF)
Microsoft Cloud Adoption Framework for Azure
Google Cloud Adoption Framework




2023 Blog, BOTs Blog, Blog, Featured

Relevance Lab is an Automation specialist company providing BOTs and Platforms for Business Processes, Applications, and Infrastructure. Our solutions leverage leading RPA (Robotic Process Automation) tools like UiPath, Automation Anywhere, and Blue Prism. We provide reusable templates for common use cases across Finance & Accounting, HR, IT, and Sales process automation.

By leveraging our robotic process automation services, our clients have realized:

  • 60-80% cost savings
  • 2-3x increase in process speed
  • 35-50% increase in employee productivity
  • Up to 30% FTE (Full-Time Equivalent) headcount reduction

The biggest challenge our customers face in adopting RPA is the "where to start" dilemma. To help identify what can be automated, we have designed the following guidelines for selecting initial use cases for implementation:

  • High frequency and volume workflows
  • High complexity processes
  • Error-prone areas where human task quality is a concern
  • Domains with compliance needs that benefit from automated outcomes
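These guidelines can be turned into a simple prioritization model. The sketch below is purely illustrative; the weights and the 0-5 rating scale are our own assumptions, not an RL methodology.

```python
# Hypothetical weights for the four guidelines; tune per organization.
WEIGHTS = {"frequency": 0.30, "volume": 0.25, "error_rate": 0.25, "compliance": 0.20}

def automation_score(ratings):
    """Weighted score for one candidate process; each criterion rated 0-5."""
    return sum(WEIGHTS[criterion] * ratings[criterion] for criterion in WEIGHTS)

def rank_candidates(processes):
    """Return process names ordered from strongest to weakest automation candidate."""
    return sorted(processes, key=lambda name: automation_score(processes[name]),
                  reverse=True)
```

For example, a high-volume invoice process with compliance needs would outrank an occasional ad-hoc report, matching the intuition behind the guidelines.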

Using these broad guidelines across a set of corporate functions we have commonly encountered the following use cases for RPA.

Finance & Accounting Automation

  • Stock Price Update
  • Purchase Order Process
  • Reconciliation Process
  • Payment Process
  • Financial and Loan Origination Process
  • Lease Accounting Process
  • Journal Process
  • Inventory Control Process
  • Error Audit Process
  • Invoice Process

Human Resources (HR) Automation

  • New Hire Onboarding Process
  • Data Approval Process
  • The Policy Processing (TPP)
  • Off-boarding Process
  • Legacy (AS/400) Process
  • Document Handling
  • Employee/HR/IT Process
  • User and Workspace- Employee/Contractor Offboarding
  • Back to Office (COVID) workflow automation and compliances

Infrastructure (IT) Management Automation

  • Distribution List Process
  • User Account Re-conciliation Process
  • Mailbox Automation & Reconciliation Process
  • User Migration & Access Control Verification Process
  • Logs Capture

Sales Automation

  • Contract Data Extraction
  • Sales Reporting
  • Sales Reconciliation Process
  • Material Edits Adjustments

With our comprehensive suite of RPA services, we have helped businesses not only adopt RPA but also maximize their investments in it.

The figure below explains the RPA Top Use Cases solved by Relevance Lab.



RL RPA Offerings
RPA Consulting/Assessment: RPA consulting and assessment is the process of evaluating an organization’s processes and identifying opportunities for automation. It is essential for ensuring that RPA implementation is successful.

RPA Implementation: RPA implementation is the process of deploying and using RPA bots to automate processes. It is essential for realizing the benefits of RPA.

Automation Design: Automation design is the process of designing and implementing automation solutions. It involves understanding the business needs, identifying the processes that are suitable for automation, and designing and implementing the automation solutions.

Automation Support: Automation support is the process of providing support to users of automation solutions. It involves providing help with troubleshooting problems, resolving issues, and providing training on how to use the automation solutions.

The figure below explains our core offerings.



Relevance Lab “Automation-First” RPA Platform Architecture
Applications under Robotic Process Execution
RPA is well suited for enterprises and enterprise applications like ERP solutions (for example, SAP or Siebel) and large-scale data or records processing applications such as mainframes. Most of these applications are data-centric and data-intensive, with extensive setup and repetitive process activities.

RPA Tools

  • RPA tools can automate any type of application in any environment.
  • They develop software robots from recordings and configuration, enhanced with programming logic.
  • They build reusable components that can be applied across multiple robots, ensuring modularity, faster development, and easier maintenance.

RPA Platforms
RPA platforms provide meaningful analytics about robots and their execution statistics.

RPA BOT Workbench
RPA execution infrastructure can be a bank of parallel physical or virtual lab machines, controlled based on usage patterns. The number of machines can be scaled up or down in parallel to meet the automation workload, and execution can be left unattended for as long as needed, since it requires no further human interaction or intervention.
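The scale-up/scale-down decision for such a machine bank can be sketched as a small policy function. This is illustrative only; the jobs-per-machine ratio and the min/max limits are assumed parameters, and the actual start/stop calls (e.g., to a hypervisor or cloud API) are left out.

```python
def target_machines(queued_jobs, jobs_per_machine=4, min_machines=1, max_machines=20):
    """Decide how many bot machines the workbench should keep running."""
    needed = -(-queued_jobs // jobs_per_machine)  # ceiling division
    return max(min_machines, min(max_machines, needed))

def scaling_actions(current, target):
    """Translate the target size into start/stop counts for the machine bank."""
    return {"start": max(0, target - current), "stop": max(0, current - target)}
```

Run periodically against the job queue, this keeps the bank sized to the workload without human intervention.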

The figure below explains the Relevance Lab “Automation-First” RPA Platform Architecture.



How to get started for new customers?

  • Reach out to Relevance Lab (write to marketing@relevancelab.com) for a quick discussion and demonstration of the standard solution
  • We will study the processes and help in identifying repetitive and manual tasks
  • Engage in creation of POC while selecting the right RPA Tool
  • Customers with standard needs can get started with a new setup in 4-6 weeks
  • Relevance Lab will also provide on-going support and managed services


Summary
Relevance Lab Automation at a Glance

  • RL has been an Automation Specialist since 2016 (7+ years).
  • Implemented 30+ successful customer automation projects covering the RPA lifecycle.
  • Globally has 60+ RPA specialists with 150+ certifications.
  • Automated 100+ processes, including customized solutions for industries such as Healthcare, BFSI, Retail, Technology Services, and Manufacturing.

References
CoE Manager|Automation Anywhere
Build Your Robotic Process Automation Center of Excellence (uipath.com)




2023 Blog, command blog, Research Gateway, Blog, Featured

Secure Research Environments (SRE) provide researchers with timely and secure access to sensitive research data, computation systems, and common analytics tools for speeding up Scientific Research in the cloud. Researchers are given access to approved data, enabling them to collaborate, analyze data, share results within proper controls and audit trails. Research Gateway provides this secure data platform with the analytical and orchestration tools to support researchers in conducting their work. Their results can then be exported safely, with proper workflows for submission reviews and approvals.

Secure Research Environments build on the original concept of the Trusted Research Environment defined by the UK NHS and use the Five Safes framework for safe use of secure data. The five elements of the framework are:

  • Safe people
  • Safe projects
  • Safe settings
  • Safe data
  • Safe outputs

The solution has the following key building blocks:

  • Data Ingress/Egress
  • Researcher Workflows & Collaborations with costs controls
  • On-going Researcher Tools Updates
  • Software Patching & Security Upgrades
  • Healthcare (or other sensitive) Data Compliances
  • Security Monitoring, Audit Trail, Budget Controls, User Access & Management

The figure below shows implementation of SRE solution with Research Gateway.



The basic concept is to design a secure data enclave into or out of which no data can be transferred without going through pre-defined workflows. Within the enclave itself, any amount or type of storage/compute/tools can be provisioned to fit the researcher’s needs. Researchers can use common research data and also bring in their own specific data.

The core functionality for SRE deals with solutions for the following:
Data Management and Preparation
This deals with “data ingress management” from both public and private sources for research. There are functionalities dealing with data ingestion, extraction, processing, cleansing, and data catalogs.

Study Preparation
Depending on the type of study and participants from different institutions, secure data enclave allows for study specific data preparation, allocation, access management and assignment to specific projects.

Secure Research Environment
A controlled cloud environment is provided for researchers to access the study data in a secure manner with no direct ingress-egress capability and conduct research using common tools like JupyterLab, RStudio, VSCode etc. for both interactive and batch processing. The shared study data is pre-mounted on research workspaces making it easy for researchers to focus on analysis without getting into complexity of infrastructure, tools and costs.

Secure Egress Approvals for Results Sharing
Post research if researchers want to extract results from the secure research environment, a specialized workflow is provided for request, review, approvals, and download of data with compliance and audit trails.

The SRE Architecture provides for Secure Ingress and Egress controls as explained in the figure below.



Building Block Detailed Steps
Data Management
  • Project Administrator creates the Data Library and research projects.
  • Project Administrator selects the Data Library project.
    • Sets up Study Bucket.
    • Creates the sub-folders to hold data.
    • Sets up an Ingress bucket for each researcher to bring in their own data.
    • Shares this with the researcher.
  • Project Administrator selects the Study screen.
    • Creates an internal study for each dataset and assigns it to the corresponding Secure Research project.
    • Creates an internal study for each ingress bucket.
  • Project Administrator assigns the researchers to the corresponding secure projects.
Secure Research Environments
  • Researcher logs in.
  • Researcher uploads their own data to the ingress bucket.
  • Researcher creates a workspace (secure research desktop).
  • Researcher connects to workspace.
  • Researcher runs code and generates output.
  • Researcher copies output to egress store.
  • Researcher submits an egress request from the portal.
Egress Application
  • Information Governance lead logs in to Egress portal.
  • IG Lead approves request.
  • Project administrator logs in to portal.
  • Project administrator approves the request.
  • IG Lead logs in and downloads the file.
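The egress steps above amount to a two-stage approval state machine. The sketch below is our own illustration (Research Gateway's actual workflow engine is not shown): it enforces that the IG lead approves before the project administrator, and that data becomes downloadable only after both approvals.

```python
class EgressRequest:
    """Two-stage approval for taking results out of the secure enclave."""

    def __init__(self, filename):
        self.filename = filename
        self.state = "submitted"

    def approve(self, role):
        """IG lead must approve first, then the project administrator."""
        if self.state == "submitted" and role == "ig_lead":
            self.state = "ig_approved"
        elif self.state == "ig_approved" and role == "project_admin":
            self.state = "admin_approved"
        else:
            raise PermissionError(f"{role} cannot approve while state is {self.state}")

    def downloadable(self):
        """Data may leave the enclave only after both approvals."""
        return self.state == "admin_approved"
```

A real implementation would also persist each transition as an audit-trail event, in line with the compliance requirements described above.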

The need for Secure Research Enclaves is growing across different countries. There is an emerging need for a consortium model, where multiple Data Producers and Consumers interact in a Secure Research Marketplace Model. The marketplace model is implemented on AWS Cloud and provides tracking of costs and billing for all participants. The solution can be hosted by a third party and provided in a Software as a Service (SaaS) model, driving the key workflows for Data Producers and Data Consumers as explained in the figure below.



Summary
Secure Research Environments are key features for enabling large institutions and governmental agencies to speed up research across different stakeholders leveraging the cloud. Relevance Lab provides a pre-built solution that can speed up the implementation of this large scale and complex deployment in a fast, secure, and cost-effective manner.

Here is a video demonstrating the solution.

To know more about this solution, feel free to write to marketing@relevancelab.com.

References
UK Health Data Research Alliance – Aligning approach to Trusted Research Environments
Trusted (and Productive) Research Environments for Safe Research
Deployment of Secure Data Environments on AWS
Microsoft Azure TRE Solution




2023 Blog, SWB Blog, Blog, Featured

Research computing is a growing need, and the AWS cloud enables researchers to process big data with scalable computing in a secure and flexible manner. While Cloud computing is a powerful platform, it also brings complexity, with new tools, nomenclature, and multiple options that distract researchers. Relevance Lab is partnering with the AWS Public Sector group and some leading US universities to create a frictionless “Research Data Platform (RDP)” leveraging open-source solutions.

Service Workbench from AWS is a powerful open-source solution for enabling research in the cloud. Customers around the globe are already using this solution for common use cases:

  • Enable researchers to use AWS Cloud with Self-service capabilities and common catalog of tools like EC2, SageMaker, S3, Studies data etc.
  • Use common Data Analysis tools like RStudio in a secure and scalable manner.
  • Setup a “Trusted Research Environment” in cloud for research with additional controls that enforce Ingress/Egress data restrictions for compliance.

While Service Workbench provides a good foundation platform for research, feedback from early adopters surfaced some challenges, mainly related to the following:

  • Complex setup requiring deep cloud know-how.
  • An admin-centric user experience that is not very researcher-friendly.
  • Scalability challenges while adopting large-scale research setups.
  • Hard to customize.
  • No enterprise support models available to guide customers through a Plan-Build-Run lifecycle.

Relevance Lab has built a modern and researcher-friendly user experience solution called the “Research Data Platform” in collaboration with AWS and its early adopters, extending the open-source foundation.

Key Functionalities of Research Data Platform
The primary goal is to drive frictionless research in cloud with following key features:

  • Built as an open-source solution and made available to institutions interested in collaborating on a common Data Science Platform for research.
  • “Project Centric” model enabling collaboration of researchers with common data, tools, and research goals in a self-service manner.
  • Modern architecture with support for containers enabling researchers to bring their own tools covering Web-based software, Desktop-based tools, and Terminal-based solutions seamlessly accessed from Researcher Data Platform.
  • Enable researchers to launch applications and choose configurations without knowledge of Cloud Infrastructure details for both regular and GPU workloads.
  • Integrate with project-centric Datasets for research, with an easy browser-based interface to upload/download data.
  • Ability to run multiple research projects across different AWS accounts with secure and scalable setup and guardrails.

The key functional flows needed by a Researcher are explained in the figure below:



Here is a link to a demo of the solution.

Solution Architecture of Research Data Platform
The building blocks for the solution leverage the Service Workbench functionality and add a separate Research Data Platform (RDP) layer that provides a UI-driven application to Researcher and Admin users. The figure below captures the building blocks for this solution.



The solution consists of the following components:

  • Webserver that serves the UI for the platform. The UI provides the entire researcher user experience whereby users can log in with their credentials and access the projects made available to them. Within the projects, users can launch applications that have been configured for them by the administrator. Users can choose the required configuration of the instances based on configurations created by the administrator.
  • Research Data Platform DB. This database stores some of the configuration information and the mapping information required to facilitate the use of the underlying “Service Workbench” open-source software.
  • Research Data Platform CLI. This command line interface allows the administrator to set up and configure projects, users, datasets, launchers and configurations easily.
  • Service Workbench. This open-source software from AWS is the underlying API-driven engine that orchestrates and manages all the AWS resources on behalf of the user.
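As an illustration of how the RDP DB's mapping information might be used (the real schema is internal to the product; the record shapes and names here are hypothetical), a launch action from the UI can be resolved to the underlying Service Workbench project and instance configuration like this:

```python
def resolve_launch(mappings, user, project, launcher):
    """Resolve a UI launch action to the Service Workbench project/config behind it.

    mappings: {(project, launcher): {"allowed_users", "swb_project", "config"}}
    (a hypothetical shape for the RDP DB records).
    """
    record = mappings.get((project, launcher))
    if record is None:
        raise KeyError(f"no launcher {launcher!r} configured for project {project!r}")
    if user not in record["allowed_users"]:
        raise PermissionError(f"{user} is not assigned to project {project}")
    return {"swb_project": record["swb_project"], "config": record["config"]}
```

The returned record is what the RDP backend would pass to the Service Workbench APIs to orchestrate the actual AWS resources.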

Deployment Architecture of Research Data Platform
The solution is deployed in an enterprise model for each customer in their AWS accounts, and we recommend the following architecture based on the AWS Well-Architected Framework, as explained in the figure below.



The deployment of the Research Data Platform consists of the following:

  • One “Main” AWS account where RDP is deployed along with the Service workbench from AWS.
  • Within the main account, Service Workbench is deployed as a serverless solution driven by APIs. It stores data in a DynamoDB database and uses AWS Service Catalog to manage and orchestrate resources. It uses Amazon S3 to create buckets that hold data.
  • Within the main account, the Research Data Platform is deployed as a web server that serves the UI, along with an API backend that communicates with the Service Workbench.
  • One or more project accounts are onboarded and can be used to create projects and access datasets.

Sample Screens for Research Data Platform
The key functionality for the solution is explained in some sample screens below.

Home Page: This is the first page that the user visits. From this page the user can choose to login to the Research Data Platform.



Projects Page: The projects page displays a card view of all the projects that the logged-in user is assigned to. Projects are set up by the administrator.



Each application that is useful to a researcher is set up as a launcher. Each launcher appears on the project workbench page as a card and the researcher can instantiate a session by clicking on the launcher card.



Files tab: This screen allows the researcher to browse the files in the datasets that are assigned to the project. A default storage area called project storage is available in every project. The project storage can also be browsed from this screen.



Launch Dialog: The user can select a configuration that is suitable for their research.



Project Details: The user can connect to Active sessions from the Workbench tab.



Sessions: An instance of a launcher is called a session. A user can connect to a session via the browser to access the application they need for conducting their research and analysis.


How Can New Customers Get Started?

  • Reach out to Relevance Lab (write to rlcatalyst@relevancelab.com) for a quick discussion and demonstration of the standard solution
  • We will capture an assessment of standard features vs. known gaps for adopting the solution
  • Engage on a Plan-Build-Run model based on deployment, enablement and operational readiness to start using Research in AWS cloud with simple and secure best practices
  • Customers with standard needs can get started with a new setup in 8-10 weeks
  • Relevance Lab will also provide on-going support and managed services

Conclusion
The Research Data Platform offers a comprehensive and researcher-friendly solution. It empowers researchers to process big data, perform data analysis, and conduct research efficiently in a secure and scalable manner. By bridging the gap between researchers and the AWS cloud, the RDP fosters innovation and advances scientific discovery in diverse domains.

References
Managing compute environments for researchers with Service Workbench on AWS
Using AWS Cloud for Research
Five ways to use AWS for research (starting right now)




2023 Blog, AWS Platform, Blog, Featured, Feature Blog

Relevance Lab (RL) has been an AWS (Amazon Web Services) partner for more than a decade now. While the journey started as a Services Partner it has now extended and matured to a niche technology partner with multiple solutions being offered on AWS Marketplace.

Here is a Quick Snapshot of AWS Capabilities:

  • RL is involved in Plan-Build-Run lifecycle of Cloud adoption by enterprises over a multi-year transformation journey.
  • The approach to Cloud Adoption is built on some key best practices covering Automation-First Approach, DevOps, Governance360, and Application-Centric Site Reliability Engineering (SRE) focus.
  • In Cloud Managed Services we cover all aspects of DevOps, AIOps, SecOps and ServiceDesk Ops leveraging our Automation Platforms – RLCatalyst BOTs, Command Centre, ServiceOne.
  • Involved with 50+ Cloud engagements covering large-scale setups (5000+ nodes, 15+ regions, 200+ apps, 5.0M+ annual spend) and optimization.
  • Deep partnership with AWS and ServiceNow to bring end-to-end Governance360 covering Asset Management, CMDB, Vulnerability & Patch Management, SIEM/SOAR, Cost/Security/Compliance Dashboards.
  • Products created and deployed on AWS for Self-Service Cloud Portals and Purpose-built cloud solutions covering HPC (High Performance Computing), Containers, Service Catalog, Cost & Budget tracking, and Scientific Research workflows.
  • Our work and resources cover Cloud Infrastructure, Cloud Apps, Cloud Data and Cloud Service Delivery with 800+ cloud trained resources, 450+ Cloud specialists and 100+ certifications.
  • RL is AWS’s number one preferred global partner as an ISV provider for Scientific Research Computing, building solutions using AWS open-source offerings like Service Workbench.


Our unique positioning of Products + Services helps create platform-based offerings delivered as playbooks for digital transformation.

Our key focus areas in Cloud Offerings are the following:

  • Cloud Management & Governance
  • Full Lifecycle Automation and Self-Service Portals
  • Containers, Microservices, Well Architected Frameworks and Kubernetes
  • AIOps and Site Reliability Engineering

What Makes Us Different?

  • Automation-First approach across the “Plan, Build & Run” lifecycle helps customers use “Cloud the Right Way,” focused on best practices like “Infrastructure as Code” and “Compliance as Code.”
  • RLCatalyst Products offer Enterprise Cloud Orchestration and Governance with a pre-built library of quick-starts, BOTs, Self-Service Cloud Portals, and Open-source solutions.
  • AWS + ServiceNow unique specialization leveraged to provide Intelligent Cloud Operations & managed services.
  • ServiceOne AIOps Platform covering workload migration, security, governance, CMDB, ITSM and DevOps.
  • Frictionless Digital Application modernization and Cloud Product Engineering services for native cloud architecture and competencies.
  • Open-Source Co-Development with AWS for Scientific Research Solutions (Higher Ed and Healthcare).
  • Agile Analytics with our Spectra Data platform, which helps build Enterprise Data Lakes and Supply Chain analytics with connectors for multiple ERP systems.

Our Solutions Sweet Spot
Governance360
Built on AWS Control Services, a prescriptive and automated maturity model for proper workload migration, governance, security, monitoring, and Service Management.

RLCatalyst BOTS Automation Engine and ServiceOne
Products covering end-to-end automation with a library of 100+ pre-built BOTs, including intelligent user and workspace onboarding and offboarding.

Research Gateway – Self Service Cloud Portals
Self-Service Cloud Portal for Scientific Research in Cloud with HPC, Genomic Pipelines, covering EC2, SageMaker, S3 etc.

ServiceNow AppInsights built on AWS AppRegistry
Dynamic Applications CMDB leveraging AWS and ServiceNow with focus on Application Centric costs, health, and risks.

DevOps Driven Engineering and Cloud Product Development
DevOps-driven CI/CD, Infra Automation and Proactive Monitoring. AWS Well-Architected. Cloud App Modernization, APM, API Gateways, Cloud Integration with Enterprise Systems, and AWS Digital Customer Experience competencies.

SPECTRA Data Platform for Cloud Data Lakes
Enterprise Data Lake with large data movement from on-prem to Cloud systems and ERP integration adapters for Supply Chain Analytics.

AWS Product Focus Areas
Control Tower, Security Hub, Service Catalog, HPC, Quantum Computing, Data Lake, ITSM Connectors, Well-Architected, SaaS (Software as a Service) Factory, Service Workbench, CloudEndure, AppStream 2.0, QuickStart for HIPAA, Bioinformatics

Focus on Software, Databases, Workloads
Open-source and App development stacks, Java, Python, MS .Net, Cloudera, Databricks, MongoDB, RedShift, Hadoop, Snowflake, Magento, WordPress, Moodle, RStudio, Nextflow


Key Verticals Solutions

  • Technology companies (ISVs & startups)
  • Media/Publishing/Higher Education/ Research
  • Pharma/Healthcare/Life Sciences
  • Financial and Insurance

The following are some Customer Solutions highlights:


  • Digital Publishing & Learning Specialist: Cloud Migration, DevOps, Digital Platform Development covering Content, Commerce, E-Learning and CRM products, User Experience Designs, Cloud Architecture, Data Cloud/BI, Sustaining Engineering, Performance Testing, Automation.
  • Global Pharma & Health Sciences Leader: Data Analytics/Search Solutions leveraging Cloud & Big Data technologies; Enterprise Data Lake analyzing ERP data (SAP and others) with extract, load, cleansing, aggregation, data modelling, and visualizations; Self-Service Portal for AWS and Hybrid Cloud provisioning.
  • Large Financial & Asset Mgmt. Firm: Drive Cloud Adoption, App Modernization, and DevOps models as part of an IT Transformation journey leveraging their Cloud, Automation, and Data Platforms.
  • Specialist Automation ISV: Global partnership working across joint long-term engagements with multiple enterprise customers covering Infrastructure Automation, Application Deployment Automation, Compliance-as-a-Code, and Hybrid Cloud Automation.

Summary
Relevance Lab has close collaboration and partnership with AWS for both products and competencies. We have been part of successful digital transformations with 50+ customers leveraging AWS across Infrastructure, Applications, Data Lakes, and Service Delivery Automation. We enable AWS Cloud adoption “The Right Way” with our comprehensive expertise and pre-built solutions, delivering better, faster, and cheaper outcomes.

To learn more about our cloud products, services, and solutions, feel free to contact us at marketing@relevancelab.com.

References
Get Dynamic Insights into Your Cloud with an Application-Centric View
Automation of User Onboarding and Offboarding Workflows




2023 Blog, BOTs Blog, RLCatalyst Blog, Blog, Featured

With growing interest and investment in new concepts like Automation and Artificial Intelligence, the common dilemma for enterprises is how to scale these for significant impact in their own context. It is easy to do a small proof of concept but much harder to make a broader impact across the landscape of Hybrid Infrastructure, Applications, and Service Delivery models. Even more complex is the Organizational Change Management for the underlying processes, culture, and “Way of Working.” There is no “silver bullet” or “cookie-cutter” approach that can deliver radical change; it requires investment in a roadmap of changes across People, Process, and Technology. The RLCatalyst solution from Relevance Lab provides an Open-Architecture approach to interconnect various systems, applications, and processes, similar to the “Enterprise Service Bus” model.

What is Intelligent Automation?
The key building blocks of automation depend on the concept of BOTs. So, what are BOTs?


  • BOTs are automation code managed by ASB orchestration
    • Infrastructure creation, update, and deletion
    • Application deployment lifecycle
    • Operational services, tasks, and workflows – Check, Act, Sensors
    • Interacting with Cloud and On-prem systems with integration adapters in a secure and auditable manner
    • Targeting any repetitive Operations tasks managed by humans that are frequent, complex (time-consuming), security/compliance related

  • What are types of BOTs?
    • Templates – CloudFormation, Terraform, Azure Resource Models, Service Catalog
    • Lambda functions, Scripts (PowerShell/python/shell scripts)
    • Chef/Puppet/Ansible configuration tools – Playbooks, Cookbooks, etc.
    • API Functions (local and remote invocation capability)
    • Workflows and state management
    • UIBOTs (with UiPath, etc.) and un-assisted non-UI BOTs
    • Custom orchestration layer with integration to Self-Service Portals and API Invocation
    • Governance BOTs with guardrails – preventive and corrective

  • What do BOTs have?
    • Infra as a code stored in source code configuration (GitHub, etc.)
    • Separation of Logic and Data
    • Managed Lifecycle (BOTs Manager and BOTs Executors) for lifecycle support and error handling
    • Intelligent Orchestration – Task, workflow, decisioning, AI/ML
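As an illustration of the building blocks above, here is a minimal BOT sketch in Python. The names and config structure are hypothetical (RLCatalyst's actual BOT interface is not shown); the point is the separation of logic and data and the managed lifecycle with error handling that feeds back to a BOT manager.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("bot")

# Data: BOT inputs kept separate from logic, e.g. loaded from a
# source-controlled JSON/YAML file (hypothetical structure).
BOT_CONFIG = {
    "name": "stop-idle-instances",
    "params": {"idle_minutes": 60, "dry_run": True},
}

def run_bot(config):
    """Logic: a managed-lifecycle wrapper -- validate, execute, report."""
    result = {"bot": config["name"], "status": "unknown"}
    try:
        params = config["params"]
        if params["idle_minutes"] <= 0:
            raise ValueError("idle_minutes must be positive")
        # ... the actual Check/Act steps would call cloud APIs here ...
        result["status"] = "dry-run" if params["dry_run"] else "success"
    except Exception as exc:  # errors are reported back to the BOT manager
        log.error("BOT %s failed: %s", config["name"], exc)
        result["status"] = "failed"
    return result

print(json.dumps(run_bot(BOT_CONFIG)))
```

Because the logic never hardcodes its inputs, the same BOT code can be re-run against different, version-controlled data files.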


To deploy BOTs across the enterprise and benefit from more sophisticated automation leveraging AI (Artificial Intelligence), RLCatalyst provides a prescriptive path to maturity as explained in the figure below.


ASB Approach
An Open Architecture approach to interconnect various systems, applications, and processes, similar to the “Enterprise Service Bus” model. This innovative approach of “software-defined” models, extendable metadata for configurations, and a hybrid architecture takes modern distributed security needs into consideration. The ASB model helps drive “Touchless Automation” with pre-built components and rapid adoption by existing enterprises.

The flexible deployment model integrates with current SaaS (Software as a Service) based ITSM platforms while allowing Automation to be managed securely inside Cloud or On-Premise data centers. The architecture supports a hybrid approach with multi-tenant components along with secure per-instance BOT servers managing local security credentials. This comprehensive approach helps scale Automation from silos to enterprise-wide benefits: human effort savings, faster velocity, better compliance, and learning models for BOT efficiency improvements.
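The “software-defined” configuration idea can be sketched as a metadata descriptor that an ASB-style orchestrator might use to register and route a BOT. All field names below are illustrative assumptions, not the actual RLCatalyst schema.

```python
# Hypothetical software-defined BOT descriptor: extendable metadata
# kept in source control alongside the BOT code it describes.
bot_descriptor = {
    "name": "create-dev-instance",
    "type": "terraform-template",           # template / script / api / workflow
    "endpoint": "git@github.com:org/bots",  # source-controlled BOT code
    "inputs": ["instance_type", "owner"],
    "guardrails": {"max_cost_per_day": 10, "approval_required": True},
    "deployment": "on-premise",             # hybrid: cloud or on-prem executor
}

def validate_descriptor(d):
    """Check the minimal required metadata before registration."""
    required = {"name", "type", "endpoint", "inputs"}
    missing = required - set(d)
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return True

print(validate_descriptor(bot_descriptor))
```

Extending the metadata (new guardrails, new deployment targets) then requires no change to the orchestrator's core logic.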


RLCatalyst provides solutions for enterprises to create their version of an Open Architecture based AIOps Platform that can integrate with their existing landscape and provide a roadmap for maturity.


  • RLCatalyst Command Centre “Integrates” with different monitoring solutions to create an Observe capability
  • RLCatalyst ServiceOne “Integrates” with ITSM solutions (ServiceNow and Freshdesk) for the Engage functionality
  • RLCatalyst BOTs Engine “Provides” a mature solution to “Design, Run, Orchestrate & Insights” for Act functionality
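The Observe, Engage, and Act split above can be sketched as a simple event pipeline. These stubs are illustrative only, not the actual RLCatalyst interfaces.

```python
def observe(event):
    """Command Centre role: classify a monitoring event (stubbed rule)."""
    return {"severity": "high" if event.get("cpu", 0) > 90 else "low", **event}

def engage(incident):
    """ServiceOne role: raise a ticket in the ITSM system (stubbed)."""
    return {"ticket_id": f"INC-{incident['host']}", **incident}

def act(ticket):
    """BOTs Engine role: pick and run a remediation BOT (stubbed)."""
    action = "restart-service" if ticket["severity"] == "high" else "log-only"
    return {**ticket, "action": action}

result = act(engage(observe({"host": "web01", "cpu": 95})))
print(result["action"])
```

In a real deployment each stage is a separate integration (monitoring tools, ServiceNow/Freshdesk, BOT executors) connected through the ASB rather than direct function calls.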

Relevance Lab is working closely with leading enterprises from different verticals of Digital Learning, Health Sciences & Financial Asset Management in creating a common “Open Platform” that helps bring an Automation-First approach and a maturity model to incrementally make Automation more “Intelligent”.

For more information, feel free to contact marketing@relevancelab.com.

References
Get Started with Building Your Automation Factory for Cloud
Intelligent Automation For User And Workspace Onboarding
Intelligent Automation with AS/400 based Legacy Systems support using UiPath
RLCatalyst BOTs Service Management connector for ServiceNow



2023 Blog, Research Gateway, Blog, Featured

Major advances are happening with the leverage of Cloud Technologies and large Open Data sets in the areas of Healthcare informatics, which includes sub-disciplines like Bioinformatics and Clinical Informatics. This is being rapidly adopted by Life Sciences and Healthcare institutions in the commercial and public sectors. This domain has deep investments in scientific research and data analytics, focusing on information, computation needs, and data acquisition techniques to optimize the acquisition, storage, retrieval, obfuscation, and secure use of information in health and biomedicine for evidence-based medicine and disease management.

In recent years, genomics and genetic data have emerged as innovative areas of research that could potentially transform healthcare. The emerging trend is personalized medicine, or precision medicine, leveraging genomics. Early diagnosis of a disease can significantly increase the chances of successful treatment, and genomics can detect a disease long before symptoms present themselves. Many diseases, including cancers, are caused by alterations in our genes. Genomics can identify these alterations and search for them using an ever-growing number of genetic tests.

With AWS, genomics customers can dedicate more time and resources to science, speeding time to insights, achieving breakthrough research faster, and bringing life-saving products to market. AWS enables customers to innovate by making genomics data more accessible and useful. AWS delivers the breadth and depth of services to reduce the time between sequencing and interpretation, with secure and frictionless collaboration capabilities across multi-modal datasets. Also, you can choose the right tool for the job to get the best cost and performance at a global scale, accelerating the modern study of genomics.

Relevance Lab Research@Scale Architecture Blueprint
Working closely with AWS Healthcare and Clinical Informatics teams, Relevance Lab is bringing a scalable, secure, and compliant solution for enterprises to pursue Research@Scale on Cloud for intramural and extramural needs. The diagram below shows the architecture blueprint for Research@Scale. The solution offered on the AWS platform covers technology, solutions, and integrated services to help large enterprises manage research across global locations.


Leveraging AWS Biotech Blueprint with our Research Gateway
A use case with the AWS Biotech Blueprint, which provides a core template for deploying a preclinical, cloud-based research infrastructure and optional informatics software on AWS.

This Quick Start sets up the following:

  • A highly available architecture that spans two availability zones
  • A preclinical virtual private cloud (VPC) configured with public and private subnets according to AWS best practices to provide you with your own virtual network on AWS. This is where informatics and research applications will run
  • A management VPC configured with public and private subnets to support the future addition of IT-centric workloads such as active directory, security appliances, and virtual desktop interfaces
  • Redundant, managed NAT gateways to allow outbound internet access for resources in the private subnets
  • Certificate-based virtual private network (VPN) services through the use of AWS Client VPN endpoints
  • Private, split-horizon Domain Name System (DNS) with Amazon Route 53
  • Best-practice AWS Identity and Access Management (IAM) groups and policies based on the separation of duties, designed to follow the U.S. National Institute of Standards and Technology (NIST) guidelines
  • A set of automated checks and alerts to notify you when AWS Config detects insecure configurations
  • Account-level logging, audit, and storage mechanisms designed to follow NIST guidelines
  • A secure way to remotely join the preclinical VPC network using the AWS Client VPN endpoint
  • A prepopulated set of AWS Systems Manager Parameter Store key/value pairs for common resource IDs
  • (Optional) An AWS Service Catalog portfolio of common informatics software that can be easily deployed into your preclinical VPC

Using the Quickstart templates, the products were added to AWS Service Catalog and imported into RLCatalyst Research Gateway.
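As a hedged sketch of this step, a product imported into AWS Service Catalog can be provisioned programmatically with boto3's `provision_product` call. The IDs, names, and parameters below are placeholders, not values from the actual deployment.

```python
def build_provision_request(product_id, artifact_id, name, params):
    """Assemble the arguments for ServiceCatalog provision_product."""
    return {
        "ProductId": product_id,
        "ProvisioningArtifactId": artifact_id,
        "ProvisionedProductName": name,
        "ProvisioningParameters": [
            {"Key": k, "Value": v} for k, v in params.items()
        ],
    }

request = build_provision_request(
    "prod-xxxxxxxx",              # placeholder product ID
    "pa-xxxxxxxx",                # placeholder provisioning artifact (version) ID
    "preclinical-vpc-demo",       # placeholder provisioned product name
    {"VpcCIDR": "10.0.0.0/16"},   # placeholder template parameter
)

# Requires boto3, credentials, and real IDs to actually provision:
# import boto3; boto3.client("servicecatalog").provision_product(**request)
print(request["ProvisionedProductName"])
```

Research Gateway performs the equivalent call on the user's behalf, so researchers only see a 1-click catalog item.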



Using the standard products, the Nextflow Workflow Orchestration engine was launched for Genomics pipeline analysis. Nextflow helps create and orchestrate analysis workflows, using AWS Batch to run the workflow processes.

Nextflow is an open-source workflow framework and domain-specific language (DSL) for Linux, developed by the Comparative Bioinformatics group at the Barcelona Centre for Genomic Regulation (CRG). The tool enables you to create complex, data-intensive workflow pipeline scripts, and simplifies the implementation and deployment of genomics analysis workflows in the cloud.
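For illustration, a Nextflow "head" job can be submitted to AWS Batch with boto3's `submit_job`; the head container runs Nextflow, which then submits per-process child jobs to the same Batch environment. The queue, job definition, and S3 work directory below are placeholders; `nf-core/rnaseq` is an example public pipeline.

```python
def build_nextflow_job(pipeline, job_queue, job_definition, workdir):
    """Assemble the arguments for Batch submit_job running a Nextflow head job."""
    return {
        "jobName": f"nextflow-{pipeline.replace('/', '-')}",
        "jobQueue": job_queue,                  # placeholder Batch queue
        "jobDefinition": job_definition,        # placeholder job definition
        "containerOverrides": {
            # The head container image is assumed to have Nextflow installed
            # and configured with an AWS Batch executor profile.
            "command": ["nextflow", "run", pipeline,
                        "-work-dir", workdir, "-profile", "batch"],
        },
    }

job = build_nextflow_job(
    "nf-core/rnaseq",
    "nextflow-queue",
    "nextflow-head:1",
    "s3://my-bucket/nextflow/work",  # placeholder S3 working directory
)

# Requires boto3 and credentials to actually submit:
# import boto3; boto3.client("batch").submit_job(**job)
print(job["jobName"])
```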

This Quick Start sets up the following environment in a preclinical VPC:

  • In the public subnet, an optional Jupyter notebook in Amazon SageMaker is integrated with an AWS Batch environment.
  • In the private application subnets, an AWS Batch compute environment for managing Nextflow job definitions and queues and for running Nextflow jobs. AWS Batch containers have Nextflow installed and configured in an Auto Scaling group.
  • Because there are no databases required for Nextflow, this Quick Start does not deploy anything into the private database (DB) subnets created by the Biotech Blueprint core Quick Start.
  • An Amazon Simple Storage Service (Amazon S3) bucket to store your Nextflow workflow scripts, input and output files, and working directory.

RStudio for Scientific Research
RStudio is a popular IDE, licensed either commercially or under AGPLv3, for working with R. RStudio is available in a desktop version or a server version that allows you to access R via a web browser.

After you’ve analyzed the results, you may want to visualize them. Shiny is a great R package, licensed either commercially or under AGPLv3, that you can use to create interactive dashboards. Shiny provides a web application framework for R. It turns your analyses into interactive web applications; no HTML, CSS, or JavaScript knowledge is required. Shiny Server can deliver your R visualization to your customers via a web browser and execute R functions, including database queries, in the background.

RStudio is provided as a standard catalog item in Research Gateway for 1-Click deployment and use. AWS provides a number of tools like Amazon Athena, AWS Glue, and others to connect to datasets for research analysis.
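As a sketch of connecting analysis to a dataset, an Amazon Athena query can be started from Python via `start_query_execution`. The database, table, and bucket names here are hypothetical.

```python
def build_athena_query(database, query, output_location):
    """Assemble the arguments for Athena start_query_execution."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_location},
    }

request = build_athena_query(
    "genomics_db",                                        # hypothetical database
    "SELECT sample_id, variant_count FROM variants LIMIT 10",
    "s3://my-results-bucket/athena/",                     # hypothetical bucket
)

# Requires boto3 and credentials to actually run:
# import boto3; boto3.client("athena").start_query_execution(**request)
print(request["QueryExecutionContext"]["Database"])
```

The query results land in the S3 output location, from where an RStudio or Shiny session can load them for visualization.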

Benefits of using AWS for Clinical Informatics

  • Data transfer and storage
    The volume of genomics data poses challenges for transferring it from sequencers in a quick and controlled fashion, then finding storage resources that can accommodate the scale and performance at a price that is not cost-prohibitive. AWS enables researchers to manage large-scale data that has outpaced the capacity of on-premises infrastructure. By transferring data to the AWS Cloud, organizations can take advantage of high-throughput data ingestion, cost-effective storage options, secure access, and efficient searching to propel genomics research forward.

  • Workflow automation for secondary analysis
    Genomics organizations can struggle with tracking the origins of data when performing secondary analyses and running reproducible and scalable workflows while minimizing IT overhead. AWS offers services for scalable, cost-effective data analysis and simplified orchestration for running and automating parallelizable workflows. Options for automating workflows enable reproducible research or clinical applications, while AWS native, partner (NVIDIA and DRAGEN), and open-source solutions (Cromwell and Nextflow) provide flexible options for workflow orchestrators to help scale data analysis.

  • Data aggregation and governance
    Successful genomics research and interpretation often depend on multiple, diverse, multi-modal datasets from large populations. AWS enables organizations to harmonize multi-omic datasets and govern robust data access controls and permissions across a global infrastructure to maintain data integrity as research involves more collaborators and stakeholders. AWS simplifies the ability to store, query, and analyze genomics data, and link it with clinical information.

  • Interpretation and deep learning for tertiary analysis
    Analysis requires integrated multi-modal datasets and knowledge bases, intensive computational power, big data analytics, and machine learning at scale, which historically could take weeks or months, delaying time to insights. AWS accelerates the analysis of big genomics data by leveraging machine learning and high-performance computing. With AWS, researchers have access to greater computing efficiencies at scale, reproducible data processing, data integration capabilities to pull in multi-modal datasets, and public data for clinical annotation, all within a compliance-ready environment.

  • Clinical applications
    Several hindrances impede the scale and adoption of genomics for clinical applications, including speed of analysis, managing protected health information (PHI), and providing reproducible and interpretable results. By leveraging the capabilities of the AWS Cloud, organizations can establish a differentiated capability in genomics to advance their applications in precision medicine and patient practice. AWS services enable the use of genomics in the clinic by providing the data capture, compute, and storage capabilities needed to empower the modernized clinical lab to decrease the time to results, all while adhering to the most stringent patient privacy regulations.

  • Open datasets
    As more life science researchers move to the cloud and develop cloud-native workflows, they bring reference datasets with them, often in their own personal buckets, leading to duplication, silos, and poor version documentation of commonly used datasets. The AWS Open Data Program (ODP) helps democratize data access by making it readily available in Amazon S3, providing the research community with a single documented source of truth. This increases study reproducibility, stimulates community collaboration, and reduces data duplication. The ODP also covers the cost of Amazon S3 storage, egress, and cross-region transfer for accepted datasets.

  • Cost optimization
    Researchers utilize massive genomics datasets, which require large-scale storage options and powerful computational processing and can be cost-prohibitive. AWS presents cost-saving opportunities for genomics researchers across the data lifecycle, from storage to interpretation. AWS infrastructure and data services enable organizations to save time and money, and devote more resources to science.
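The storage side of the cost-optimization point above can be illustrated with an S3 lifecycle policy that tiers raw sequencing data to colder storage classes over time. The bucket name, prefix, and day thresholds are placeholders; the payload shape matches S3's `put_bucket_lifecycle_configuration` API.

```python
def build_lifecycle_rules(prefix, ia_days=30, glacier_days=90):
    """A lifecycle configuration for S3 put_bucket_lifecycle_configuration."""
    return {
        "Rules": [{
            "ID": "tier-genomics-raw-data",
            "Status": "Enabled",
            "Filter": {"Prefix": prefix},
            "Transitions": [
                # Move infrequently accessed raw reads to cheaper tiers.
                {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                {"Days": glacier_days, "StorageClass": "GLACIER"},
            ],
        }]
    }

config = build_lifecycle_rules("raw/fastq/")  # placeholder prefix

# Requires boto3 and credentials to actually apply:
# import boto3; boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-genomics-bucket", LifecycleConfiguration=config)
print(config["Rules"][0]["Transitions"][1]["StorageClass"])
```

Raw FASTQ files that are rarely re-read after secondary analysis are typical candidates for such tiering.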

Summary
Relevance Lab is a specialist AWS partner working closely on Health Informatics and Genomics solutions, leveraging existing AWS solutions and complementing them with its Self-Service Cloud Portal solutions, automation, and governance best practices.

To know more about how we can help standardize, scale, and speed up Scientific Research in the Cloud, feel free to contact us at marketing@relevancelab.com.

References
AWS Whitepaper on Genomics Data Transfer, Analytics and Machine Learning
Genomics Workflows on AWS
HPC on AWS Video – Running Genomics Workflows with Nextflow
Workflow Orchestration with Nextflow on AWS Cloud
Biotech Blueprint on AWS Cloud
Running R on AWS
Advanced Bioinformatics Workshop



