DSS Configuration And Setup

Overview

A decision support system (DSS) is a composite tool that collects, organizes, and analyzes business data to facilitate quality decision-making for management, operations, and planning. A well-designed DSS helps decision-makers compile a variety of data from many sources: raw data, documents, personal knowledge from employees, management, executives, and business models. DSS analysis helps organizations identify and solve problems and make decisions.

This document explains how to define the configurations and set up a new dashboard in the DSS.

Pre-requisites

Before you proceed with the configuration, make sure the following pre-requisites are met -

  • Prior Knowledge of Spring Boot

  • Prior Knowledge of Kafka

  • Prior Knowledge of Elasticsearch

  • Prior Knowledge of Kibana

  • Prior Knowledge of EQL (Elastic Query Language)

  • Prior Knowledge of JSON

Key Functionalities

  1. Creating a DSS dashboard schema

  2. DSS ingest service APIs

  3. Ingest service configurations

  4. Creating a Kafka sink connector to push the data to Elasticsearch

1. Creating a DSS dashboard schema

Before indexing into the DSS collection v2 index, create the schema in Elasticsearch using the Kibana query provided in the attached file (dss-collection_v2-schema.txt, DSS Collection v2 Schema).
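
The exact query is in the attached schema file. Purely as an illustrative sketch (not the actual DSS schema), creating the index could look like the request below. It assumes an Elasticsearch 6.x cluster, consistent with the connector's "type.name": "general" setting in section 4, and the dataObject/domainObject field names are assumptions based on the configuration descriptions later in this document.

curl -X PUT \
 http://elasticsearch-client-v1.es-cluster:9200/dss-collection_v2 \
 -H 'Content-Type: application/json' \
 -d '{
     "settings": { "number_of_shards": 1, "number_of_replicas": 1 },
     "mappings": {
       "general": {
         "properties": {
           "dataContext": { "type": "keyword" },
           "dataContextVersion": { "type": "keyword" },
           "dataObject": { "type": "object" },
           "domainObject": { "type": "object" }
         }
       }
     }
 }'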

2. DSS ingest service API

The ingest service is a microservice that runs as a pipeline: it validates, transforms, and enriches the incoming data and pushes it to the Elasticsearch index. The ingest service fetches the data from the index (paymentsindex-v1) specified in the API call below and reads the configuration files associated with v1. All the configuration files are available here.

curl -X POST \
http://dashboard-ingest.egov:8080/dashboard-ingest/ingest/migrate/paymentsindex-v1/v1 \
   -H 'Cache-Control: no-cache' \
   -H 'Content-Type: application/json' \
   -H 'Postman-Token: d380bebb-383f-1b7c-76d1-10c1dc07dd06' \
   -d '{
   "RequestInfo": {
     "apiId": "string",
     "ver": "string",
     "ts": null,
     "action": "string",
     "did": "string",
     "key": "string",
     "msgId": "string",
     "authToken": "b843ef27-1ac6-49b8-ab71-cd0c22f4e50e",
     "userInfo": {
       "id": 23299,
       "uuid":"e721639b-c095-40b3-86e2-acecb2cb6efb",
       "userName": "9337682030",
       "name": "Abhilash Seth",
       "type": "CITIZEN",
       "mobileNumber": "9337682030",
       "emailId": "abhilash.seth@gmail.com",
       "roles": [
         {
           "id": 281,
           "name": "Citizen"
         }
       ]
     }
   }
}'

3. Ingest service configurations

  • Transform collection schema for V2

    • The transform collection v1 configuration file is used to map the incoming data. The mapped data goes inside the data object in the DSS collection v2 index.

    • Here, $i is a variable that is incremented for each record in paymentDetails, and $j is a variable that is incremented for each record in billDetails.

  • Enrichment Domain Configuration

    • This configuration defines and directs the enrichment process that the data goes through.

    • For example, if the incoming data belongs to the Collection module, the Collection domain config is picked, and based on the business type specified in the data, the right config is selected.

    • To enrich the Collection data, the domain index specified in the configuration is queried with the right arguments, and the response data is obtained, transformed, and set.

  • Topic Context Configuration

    • Topic Context Configuration is an outline that defines which data is received on which Kafka topic.

    • The Indexer service and many other services send out data on different Kafka topics. If the ingest service is to receive that data and pass it through the pipeline, the context and the version of the data being received have to be set. This configuration identifies which Kafka topic the data was consumed from and what the mapping for that data is.

  • JOLT Domain Transformation Schema

    • JOLT is a JSON-to-JSON transformation library. It is used to change the structure of the data and transform it in a generic way.

    • A transformation schema is written for each data context, and the data is transformed against that schema to obtain the transformed data (see the illustrative spec after this list).

  • Validator Schema

    • The validator schema is built on the Everit JSON Schema library. By validating the data against this schema, the service ensures that the data abides by the rules and requirements defined in the schema (see the illustrative schema after this list).

  • Enhance Domain Configuration

    • This configuration defines and directs the enrichment process that the data goes through.

    • For example, if the incoming data belongs to the Collection module, the Collection domain config is picked, and based on the business type specified in the data, the right config is selected and the final data is placed inside the domain object.

    • To enhance the Collection data, the domain index specified in the configuration is queried with the right arguments, and the response data is obtained, transformed, and set.
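
As a concrete illustration of the JOLT Domain Transformation Schema described above, the snippet below is a minimal JOLT shift spec. It is not the actual DSS spec, and the field names are assumptions chosen only to show how an incoming payment field can be moved under the data object.

[
  {
    "operation": "shift",
    "spec": {
      "Payment": {
        "tenantId": "dataObject.tenantId",
        "totalAmountPaid": "dataObject.totalAmountPaid"
      }
    }
  }
]

Applied to {"Payment": {"tenantId": "pb.amritsar", "totalAmountPaid": 1500}}, this spec produces {"dataObject": {"tenantId": "pb.amritsar", "totalAmountPaid": 1500}}. The real transformation schemas are written per data context.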
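
Similarly, for the Validator Schema, a minimal illustrative JSON Schema (draft-07, which the Everit library supports) could require the transformed record to carry a data context and a data object. The property names here are assumptions, not the actual DSS schema.

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "dataContext": { "type": "string" },
    "dataContextVersion": { "type": "string" },
    "dataObject": { "type": "object" }
  },
  "required": ["dataContext", "dataObject"]
}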

For Kafka Connect to work, direct push must be disabled in the ingest pipeline application properties or in the environment configuration:

es.push.direct=false

If the DSS collection index data is indexed into ES directly through the ingest pipeline (without the Kafka connector), direct push must be enabled in the application properties or in the environment configuration:

es.push.direct=true

4. Creating a Kafka sink connector to push the data to Elasticsearch

  • Configure the Kafka topics in the environment or in the ingest pipeline application properties as shown below.

  • To start the indexing, create a connector that takes data from the topic and pushes it to the index specified in "transforms.TopicNameRouter.replacement". The Elasticsearch host for the connector must be specified in "connection.url".

  • To create the Kafka connector, run the below curl command inside the playground pod:

curl -X POST \
 http://kafka-connect.kafka-cluster:8083/connectors/ \
 -H 'Cache-Control: no-cache' \
 -H 'Content-Type: application/json' \
 -H 'Postman-Token: 419e68ba-ffb9-4da9-86e1-7ad5a4c8d0b9' \
 -d '{
     "name": "dss-collection_v2-es-sink",
     "config": {
     "connector.class":
"io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
     "type.name": "general",
     "tasks.max": "1",
     "max.retries": "15",
     "key.ignore": "false",
     "retry.backoff.ms": "5000",
     "max.buffered.records": "2000",
     "value.converter": "org.apache.kafka.connect.json.JsonConverter",
     "errors.log.enable": "true",
     "key.converter":
"org.apache.kafka.connect.storage.StringConverter",
     "read.timeout.ms": "10000",
     "topics": "dss-collection-update",
     "batch.size": "1000",
     "max.in.flight.requests": "2",
     "schema.ignore": "true",
     "behavior.on.malformed.documents": "warn",
     "flush.timeout.ms": "3600000",
     "errors.deadletterqueue.topic.name": "dss-collection_v2-failed",
     "errors.tolerance": "all",
     "value.converter.schemas.enable": "false",
     "name": "dss-collection_v2-es-sink",
     "connection.url": "http://elasticsearch-client-v1.es-cluster:9200",
     "linger.ms": "1000",
     "transforms": "TopicNameRouter",
     "transforms.TopicNameRouter.type":
"org.apache.kafka.connect.transforms.RegexRouter",
     "transforms.TopicNameRouter.regex": "dss-collection_v2*",
     "transforms.TopicNameRouter.replacement": "dss-collection_v2"
     }
}'

To delete the Kafka connector, run the below curl command:

curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/dss-collection_v2-es-sink'
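
To verify that the connector was registered and its task is running, query the standard Kafka Connect REST status endpoint (assuming the same kafka-connect service endpoint used above):

curl -X GET 'http://kafka-connect.kafka-cluster:8083/connectors/dss-collection_v2-es-sink/status'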

Reference Docs

Doc Links

Title | Link
DSS Backend Configuration Manual |
DSS Dashboard - Technical Document for UI | https://digit-discuss.atlassian.net/wiki/spaces/EPE/pages/283017217/DSS+Dashboard+-+Technical+Document+for+UI
DSS Technical Documentation |
