Commit ec0853ab authored by vaisakh.nair

new data added

parent 0297bb70
Pipeline #53022 failed with stage
# Ignore the logs directory
logs/
# Ignore the password file
passwords.txt
# Ignore git and cache folders
.git
.cache
.gitignore
.gitlab-ci.yml
variables.yml
# Ignore all markdown and class files
*.md
**/*.class
.env
__pycache__
*.pyc
*.pyo
*.pyd
.Python
pip-log.txt
pip-delete-this-directory.txt
.tox
.coverage
.coverage.*
nosetests.xml
coverage.xml
*.cover
*.log
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
*.pyc
*.iml
*.xml
*.patch
.idea/
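Most of the patterns above are ordinary shell-style globs; a minimal sketch of how a character-class pattern such as `*.py[cod]` covers `.pyc`, `.pyo`, and `.pyd` in one rule, using only the standard library (`fnmatch` is an illustration of glob matching, not how Git itself evaluates `.gitignore`):

```python
from fnmatch import fnmatch

# "*.py[cod]" matches any name ending in .py followed by exactly
# one of the characters c, o, or d.
pattern = "*.py[cod]"

print(fnmatch("module.pyc", pattern))  # True
print(fnmatch("module.pyo", pattern))  # True
print(fnmatch("module.py", pattern))   # False (no trailing class character)
```

Git's own matcher adds directory-anchoring and `**` semantics on top of this, but the character-class behavior is the same.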
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.3.0
    hooks:
      - id: end-of-file-fixer
      - id: trailing-whitespace
      - id: requirements-txt-fixer
  - repo: https://github.com/omnilib/ufmt
    rev: v2.0.0
    hooks:
      - id: ufmt
        additional_dependencies:
          - black == 22.6.0
          - usort == 1.0.4
  - repo: https://github.com/PyCQA/flake8
    rev: 5.0.4
    hooks:
      - id: flake8
        args:
          - "--max-line-length=120"
          - "--max-complexity=20"
          - "--select=B,C,E,F,W,T4,B9"
          # these error codes are ignored by flake8;
          # their meanings are documented at
          # https://flake8.pycqa.org/en/latest/user/error-codes.html
          - "--ignore=E203,E266,E501,W503,F403,F401,E402"
FROM python:3.9.10-slim
COPY requirements.txt /code/requirements.txt
WORKDIR /code
RUN pip install --no-cache-dir -r requirements.txt
COPY . /code
CMD [ "python", "app.py" ]
Release Note:
- Version: v6.9
Features:
- Report filter support
- Back-fill enhancement support
if __name__ == "__main__":
    from dotenv import load_dotenv

    load_dotenv()
    import argparse
    import gc

    import uvicorn

    from scripts.config.app_configurations import Service
    from scripts.logging.logging import logger

    gc.collect()
    ap = argparse.ArgumentParser()
    ap.add_argument(
        "--port",
        "-p",
        required=False,
        default=Service.PORT,
        help="Port to start the application.",
    )
    ap.add_argument(
        "--bind",
        "-b",
        required=False,
        default=Service.HOST,
        help="IP to start the application.",
    )
    arguments = vars(ap.parse_args())
    logger.info(f"App Starting at {arguments['bind']}:{arguments['port']}")
    uvicorn.run("main:app", host=arguments["bind"], port=int(arguments["port"]))
-----BEGIN RSA PRIVATE KEY-----
MIICWwIBAAKBgQClilTaeHq6Zc+kWHCNl1O0btGRm7ct3O5zqWx1mwwLUWH14eft
Hi5wIbOYh79JQ9BO2OA4UjPq31uwmJ96Okl0OULfENhwd/D7P3mnoRlktPT2t+tt
RRrKvx3wNpOy/3nBsXnNt8EKxyA7k9vbqLbv9pGw2hcqOYe/NGTkmm1PswIDAQAB
AoGAZPARR1l5NBkKYGKQ1rU0E+wSmx+AtVVmjF39RUSyNmB8Q+poebwSgsr58IKt
T6Yq6Tjyl0UAZTGmferCK0xJJrqyP0hMn4nNNut+acWMKyt+9YrA2FO+r5Jb9JuT
SK35xXnM4aZLGppgWJxRzctpIz+qkf6oLRSZme0AuiqcwYECQQDY+QDL3wbWplRW
bze0DsZRMkDAkNY5OCydvjte4SR/mmAzsrpNrS5NztWbaaQrefoPbsdYBPbd8rS7
C/s/0L1zAkEAw1EC5zt2STuhkcKLa/tL+bk8WHHHtf19aC9kBj1TvWBFh+JojWCo
86iK5fLcHzhyQx5Qi3E9LG2HvOWhS1iUwQJAKbEHHyWW2c4SLJ2oVXf1UYrXeGkc
UNhjclgobl3StpZCYAy60cwyNo9E6l0NR7FjhG2j7lzd1t4ZLkvqFmQU0wJATLPe
yQIwBLh3Te+xoxlQD+Tvzuf3/v9qpWSfClhBL4jEJYYDeynvj6iry3whd91J+hPI
m8o/tNfay5L+UcGawQJAAtbqQc7qidFq+KQYLnv5gPRYlX/vNM+sWstUAqvWdMze
JYUoTHKgiXnSZ4mizI6/ovsBOMJTb6o1OJCKQtYylw==
-----END RSA PRIVATE KEY-----
-----BEGIN PUBLIC KEY-----
MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQClilTaeHq6Zc+kWHCNl1O0btGR
m7ct3O5zqWx1mwwLUWH14eftHi5wIbOYh79JQ9BO2OA4UjPq31uwmJ96Okl0OULf
ENhwd/D7P3mnoRlktPT2t+ttRRrKvx3wNpOy/3nBsXnNt8EKxyA7k9vbqLbv9pGw
2hcqOYe/NGTkmm1PswIDAQAB
-----END PUBLIC KEY-----
[MODULE]
name = form-mt
[LOGGING]
level = $LOG_LEVEL
[SERVICE]
port=5121
host=0.0.0.0
version_no=1.0.0
workers=1
module_name=$APP_NAME
enable_traceback = True
secure_cookie = $SECURE_COOKIE
[MONGO_DB]
uri= $MONGO_URI
[POSTGRES]
maintenance = $MAINTENANCE_URI
assistant = $ASSISTANT_URI
[DATABASES]
metadata_db=$METADATA_DB
ilens_assistant=$ILENS_ASSISTANT_DB
ilens_asset_model=$ILENS_ASSET_MODEL_DB
[PATH_TO_SERVICES]
scheduler_proxy = $SCHEDULER_PROXY
data_engine=$FORM_DE
form_mt = $FORM_MT
metadata_services=$METADATA_SERVICES
audit_proxy=$AUDIT_PROXY
workflow_mt=$WORKFLOW_MT
ilens_events=$ILENS_EVENTS
oee_services=$OEE_SERVICES
[BACKFILL]
interval=$INTERVAL
trigger_bandwidth=$TRIGGER_BANDWIDTH
[DIRECTORY]
base_path = $BASE_PATH
mount_dir = $MOUNT_DIR
keys_path = data/keys
[REDIS]
uri=$REDIS_URI
login_db = 9
project_tags_db = 18
[KAIROS]
kairos_url = $KAIROS_URI
[KAFKA]
host=$KAFKA_HOST
port=$KAFKA_PORT
topic=$KAFKA_TOPIC
history_topic=$KAFKA_HISTORY_OUTPUT_TOPIC
audit_topic=$KAFKA_AUDIT_TOPIC
enable_sites_partition=$ENABLE_KAFKA_PARTITION
split_key=$KAFKA_PARTITION_KEY
round_robin_enable=$ROUND_ROBIN_PARTITION
partition_db=13
[AUDITING]
periodic_entry_auditing=$PERIODIC_ENTRY_AUDITING
form_non_periodic_auditing=$FORM_NON_PERIODIC_AUDITING
form_periodic_auditing=$FORM_PERIODIC_AUDITING
[PATH_TO_OTHER_SERVICES]
email_service = $EMAIL_SERVICE_PROXY
[MQTT]
uri = $MQTT_URI
host = $MQTT_URL
port = $MQTT_PORT
publish_base_topic = ilens/notifications
[EVENTS]
enable_events=$ENABLE_EVENTS
# Add in environment variables here when updated for better collaboration
MODULE_NAME=form-management
MONGO_URI=mongodb://192.168.0.220:2717/
METADATA_DB=ilens_configuration
ILENS_ASSISTANT_DB=ilens_assistant
ILENS_ASSET_MODEL_DB=ilens_asset_model
ASSISTANT_URI=postgresql://postgres:postgres@192.168.0.220:5432/ilens_assistant
FORM_DE=http://192.168.0.220/formde/
METADATA_SERVICES=http://192.168.0.220/ilens_api/
KAIROS_URI=http://192.168.0.220:8080/
BASE_PATH=/opt/services/ilens2.0/volumes
MOUNT_DIR=/form-management
REDIS_URI = redis://192.168.0.220:6379
KAFKA_HOST=192.168.0.220
KAFKA_PORT=9092
KAFKA_TOPIC=ilens_dev
KAFKA_HISTORY_OUTPUT_TOPIC=ilens_dev_backup
KAFKA_AUDIT_TOPIC=audit_logs
MAINTENANCE_URI = postgresql://postgres:postgres@192.168.0.220:5432/maintenance_logbook
FORM_MT = http://192.168.0.220/form-mt/
PERIODIC_ENTRY_AUDITING=true
FORM_NON_PERIODIC_AUDITING=true
FORM_PERIODIC_AUDITING=true
ENABLE_KAFKA_PARTITION=true
ROUND_ROBIN_PARTITION=true
INTERVAL=60
EMAIL_SERVICE_PROXY=https://cloud.ilens.io/sms-util
MQTT_URL=192.168.0.220
MQTT_PORT=1883
SECURE_COOKIE=False
CORS_URLS=staging.ilens.io
SW_DOCS_URL=/docs
SW_OPENAPI_URL=/openapi.json
ENABLE_CORS=True
AUDIT_PROXY=http://192.168.0.220/audit_tracker
LOG_LEVEL=QTRACE
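These variables are loaded at startup by `python-dotenv`'s `load_dotenv()`. A much-simplified sketch of what that loading step does (the real library also handles quoting, export prefixes, and interpolation; `parse_env` is a hypothetical helper for illustration only):

```python
import os


def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = """
# Add in environment variables here when updated for better collaboration
KAFKA_PORT=9092
LOG_LEVEL=QTRACE
"""
parsed = parse_env(sample)
print(parsed)  # → {'KAFKA_PORT': '9092', 'LOG_LEVEL': 'QTRACE'}

# Like load_dotenv(), prefer values already present in the real environment.
for key, value in parsed.items():
    os.environ.setdefault(key, value)
```

The `setdefault` call mirrors `load_dotenv()`'s default behavior of not overriding variables that are already set in the process environment.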
import os

from fastapi import FastAPI, Depends
from fastapi.middleware.cors import CORSMiddleware
from jwt_signature_validator.encoded_payload import (
    EncodedPayloadSignatureMiddleware as SignatureVerificationMiddleware,
)

from scripts.config.app_configurations import Service
from scripts.constants.app_constants import Secrets
from scripts.core.services import (
    stage_router,
    comment_router,
    remark_router,
    form_router,
    custom_router,
    stages_data_router,
    mobile_task_router,
    health_status,
)
from scripts.utils.security_utils.decorators import CookieAuthentication

secure_access = os.environ.get("SECURE_ACCESS", default=False)
auth = CookieAuthentication()
app = FastAPI(
    title="iLens Assistant Form Management",
    version="1.0.0",
    root_path="form-mt",
    description="Form Management App",
    openapi_url=os.environ.get("SW_OPENAPI_URL"),
    docs_url=os.environ.get("SW_DOCS_URL"),
    redoc_url=None,
)
if Service.verify_signature in [True, "True", "true"]:
    app.add_middleware(
        SignatureVerificationMiddleware,
        jwt_secret=Secrets.signature_key,
        jwt_algorithms=Secrets.signature_key_alg,
        protect_hosts=Service.protected_hosts,
    )
# Environment variables are always strings, so only string values are checked.
if os.environ.get("ENABLE_CORS") in ("true", "True") and os.environ.get("CORS_URLS"):
    app.add_middleware(
        CORSMiddleware,
        allow_origins=os.environ.get("CORS_URLS").split(","),
        allow_credentials=True,
        allow_methods=["GET", "POST", "DELETE", "PUT"],
        allow_headers=["*"],
    )
auth_enabled = [Depends(auth)] if secure_access in [True, "true", "True"] else []
app.include_router(stage_router, dependencies=auth_enabled)
app.include_router(comment_router, dependencies=auth_enabled)
app.include_router(remark_router, dependencies=auth_enabled)
app.include_router(form_router, dependencies=auth_enabled)
app.include_router(custom_router, dependencies=auth_enabled)
app.include_router(stages_data_router, dependencies=auth_enabled)
app.include_router(mobile_task_router, dependencies=auth_enabled)
app.include_router(health_status)
# aiofiles~=0.8.0
# aiohttp~=3.8.1
# crypto~=1.4.1
# pydantic~=1.9.0
# python-dateutil~=2.8.2
cryptography~=36.0.1
fastapi~=0.73.0
formio-data~=0.3.14
httpx~=0.22.0
ilens-kafka-publisher==0.4.2
jwt-signature-validator~=0.0.1
kafka-python~=2.0.2
numpy~=1.22.2
openpyxl~=3.0.9
paho-mqtt~=1.6.1
pandas~=1.4.1
pre-commit~=2.20.0
psycopg2-binary~=2.9.3
pyjwt~=2.3.0
pymongo~=4.0.1
python-dotenv~=0.19.2
python-multipart~=0.0.5
pytz~=2021.3
pyyaml~=6.0
redis~=4.1.4
requests~=2.27.1
sqlalchemy-utils~=0.38.2
sqlalchemy==1.4.31
uvicorn~=0.17.5
import shutil

from scripts.config.app_configurations import config

if __name__ == "__main__":
    from dotenv import load_dotenv

    load_dotenv()
import os.path


class KeyPath:
    keys_path = config["DIRECTORY"]["keys_path"]
    if not os.path.isfile(os.path.join(keys_path, "public")) or not os.path.isfile(
        os.path.join(keys_path, "private")
    ):
        if not os.path.exists(keys_path):
            os.makedirs(keys_path)
        shutil.copy(os.path.join("assets", "keys", "public"), os.path.join(keys_path, "public"))
        shutil.copy(os.path.join("assets", "keys", "private"), os.path.join(keys_path, "private"))
    public = os.path.join(keys_path, "public")
    private = os.path.join(keys_path, "private")
"""
This file exposes configurations from config file and environments as Class Objects
"""
import shutil
if __name__ == '__main__':
from dotenv import load_dotenv
load_dotenv()
import os.path
import sys
from configparser import ConfigParser, BasicInterpolation
class EnvInterpolation(BasicInterpolation):
"""
Interpolation which expands environment variables in values.
"""
def before_get(self, parser, section, option, value, defaults):
value = super().before_get(parser, section, option, value, defaults)
if not os.path.expandvars(value).startswith('$'):
return os.path.expandvars(value)
else:
return
try:
config = ConfigParser(interpolation=EnvInterpolation())
config.read("conf/application.conf")
except Exception as e:
print(f"Error while loading the config: {e}")
print("Failed to Load Configuration. Exiting!!!")
sys.stdout.flush()
sys.exit()
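The interpolation hook above can be exercised in isolation; a minimal sketch, assuming an `EnvInterpolation` class defined exactly as above and using made-up `DEMO_*` variable names:

```python
import os
from configparser import BasicInterpolation, ConfigParser


class EnvInterpolation(BasicInterpolation):
    """Expand environment variables in option values; unset ones become None."""

    def before_get(self, parser, section, option, value, defaults):
        value = super().before_get(parser, section, option, value, defaults)
        expanded = os.path.expandvars(value)
        # expandvars leaves unknown variables untouched, so a leading '$'
        # means the variable was not set in the environment.
        return expanded if not expanded.startswith("$") else None


os.environ["DEMO_LOG_LEVEL"] = "DEBUG"
os.environ.pop("DEMO_UNSET_VAR", None)  # ensure the second option stays unset
parser = ConfigParser(interpolation=EnvInterpolation())
parser.read_string("[LOGGING]\nlevel = $DEMO_LOG_LEVEL\nmissing = $DEMO_UNSET_VAR\n")
print(parser.get("LOGGING", "level"))    # DEBUG
print(parser.get("LOGGING", "missing"))  # None
```

Note that options backed by unset variables come back as `None`, which is why the classes below guard their values with truthiness checks before use.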
class Service:
    MODULE_NAME = config["MODULE"]["name"]
    HOST = config.get("SERVICE", "host")
    PORT = config.getint("SERVICE", "port")
    secure_cookie = config.getboolean("SERVICE", "secure_cookie", fallback=True)
    verify_signature = os.environ.get("VERIFY_SIGNATURE", False)
    protected_hosts = os.environ.get("PROTECTED_HOSTS", "").split(",")


class DBConf:
    MONGO_URI = config.get("MONGO_DB", "uri")
    if not MONGO_URI:
        print("Error, environment variable MONGO_URI not set")
        sys.exit(1)
    MAINTENANCE_DB_URI = config.get("POSTGRES", "maintenance")
    if not MAINTENANCE_DB_URI:
        print("MAINTENANCE_DB_URI env variable missing")
        sys.exit(1)
    ASSISTANT_DB_URI = config.get("POSTGRES", "assistant")
    if not ASSISTANT_DB_URI:
        print("ASSISTANT_DB_URI env variable missing")
        sys.exit(1)


class KafkaConf:
    host = config.get("KAFKA", "host")
    port = config.get("KAFKA", "port")
    topic = config.get("KAFKA", "topic")
    backdated_topic = config.get("KAFKA", "history_topic")
    if not all([topic, host, port]):
        print("KAFKA env variables missing, continuing without Kafka/Kairos support")
    audit_topic = config.get("KAFKA", "audit_topic")
    # Option names must match the keys in conf/application.conf,
    # not the environment variable names they interpolate.
    enable_sites_partition = config.getboolean("KAFKA", "enable_sites_partition", fallback=True)
    split_key = config["KAFKA"].get("split_key", "site_id")
    round_robin_enable = config.getboolean("KAFKA", "round_robin_enable", fallback=True)
    redis_db = config.getint("KAFKA", "partition_db")
class Logging:
    level = config.get("LOGGING", "level", fallback="INFO") or "INFO"
    print(f"Logging Level set to: {level}")


class StoragePaths:
    module_name = config.get("SERVICE", "module_name") or "form_management"
    base_path = os.path.join("data", module_name)


class DatabaseConstants:
    metadata_db = config.get("DATABASES", "metadata_db") or "ilens_configuration"
    ilens_assistant_db = config.get("DATABASES", "ilens_assistant") or "ilens_assistant"
    ilens_asset_model_db = config.get("DATABASES", "ilens_asset_model") or "ilens_asset_model"


class PathToServices:
    DATA_ENGINE = config.get("PATH_TO_SERVICES", "data_engine")
    if not DATA_ENGINE:
        print("FORM_DE not set, proceeding without data engine support")
    METADATA_SERVICES = config.get("PATH_TO_SERVICES", "metadata_services")
    if not METADATA_SERVICES:
        print("METADATA_SERVICES not set, proceeding without metadata_services support")
    AUDIT_PROXY = config.get("PATH_TO_SERVICES", "audit_proxy")
    if not AUDIT_PROXY:
        print("AUDIT_PROXY not set, proceeding without audit_proxy support")
    WORKFLOW_MT = config.get("PATH_TO_SERVICES", "workflow_mt")
    if not WORKFLOW_MT:
        print("WORKFLOW_MT not set, proceeding without workflow_mt support")
    FORM_MT = config.get("PATH_TO_SERVICES", "form_mt")
    if not FORM_MT:
        print("Error, environment variable FORM_MT not set")
        sys.exit(1)
    ILENS_EVENTS = config.get("PATH_TO_SERVICES", "ilens_events")
    if not ILENS_EVENTS:
        print("Error, environment variable ILENS_EVENTS not set")
        sys.exit(1)
    OEE_SERVICES = config.get("PATH_TO_SERVICES", "oee_services")
    if not OEE_SERVICES:
        print("Error, environment variable OEE_SERVICES not set")
        sys.exit(1)
class PathToStorage:
    BASE_PATH = config.get("DIRECTORY", "base_path")
    if not BASE_PATH:
        print("Error, environment variable BASE_PATH not set")
        sys.exit(1)
    MOUNT_DIR = config.get("DIRECTORY", "mount_dir")
    if not MOUNT_DIR:
        print("Error, environment variable MOUNT_DIR not set")
        sys.exit(1)
    MODULE_PATH = os.path.join(BASE_PATH, MOUNT_DIR.lstrip("/"))
    FORM_IO_UPLOADS = os.path.join(MODULE_PATH, "form_io_uploads")
    TEMPLATES_UPLOADS = os.path.join(MODULE_PATH, "templates_uploads")
    LOGS_MODULE_PATH = f"{BASE_PATH}/logs{MOUNT_DIR}/"


class KeyPath:
    keys_path = config["DIRECTORY"]["keys_path"]
    if not os.path.isfile(os.path.join(keys_path, "public")) or not os.path.isfile(
        os.path.join(keys_path, "private")
    ):
        if not os.path.exists(keys_path):
            os.makedirs(keys_path)
        shutil.copy(os.path.join("assets", "keys", "public"), os.path.join(keys_path, "public"))
        shutil.copy(os.path.join("assets", "keys", "private"), os.path.join(keys_path, "private"))
    public = os.path.join(keys_path, "public")
    private = os.path.join(keys_path, "private")


class RedisConfig:
    uri = config.get("REDIS", "uri")
    login_db = config.getint("REDIS", "login_db")
    project_tags_db = config.getint("REDIS", "project_tags_db")


class KairosConfig:
    uri = config.get("KAIROS", "kairos_url")


class BackFill:
    interval_in_mins = config.getint("BACKFILL", "interval", fallback=60)
    trigger_step_threshold = config.getint("BACKFILL", "trigger_bandwidth", fallback=300)


class EnableAuditing:
    periodic_entry_auditing = config.getboolean("AUDITING", "periodic_entry_auditing", fallback=False)
    form_non_periodic_auditing = config.getboolean("AUDITING", "form_non_periodic_auditing", fallback=False)
    form_periodic_auditing = config.getboolean("AUDITING", "form_periodic_auditing", fallback=False)


class OtherService:
    EMAIL_URL = config["PATH_TO_OTHER_SERVICES"]["email_service"]


class MQTTConf:
    uri = config["MQTT"]["uri"]
    host = config["MQTT"]["host"]
    port = int(config["MQTT"]["port"])
    publish_base_topic = config["MQTT"]["publish_base_topic"]


class EnableEvents:
    enable_events = config.getboolean("EVENTS", "enable_events", fallback=True)
from scripts.config.app_configurations import DatabaseConstants
class Secrets:
LOCK_OUT_TIME_MINS = 30
leeway_in_mins = 10
unique_key = '45c37939-0f75'
token = '8674cd1d-2578-4a62-8ab7-d3ee5f9a'
issuer = "ilens"
alg = "RS256"
class StatusCodes:
SUCCESS = [200, 201, 204]
class UserCollectionKeys:
KEY_LANGUAGE = "language"
KEY_NAME = "name"
KEY_USER_ID = "user_id"
KEY_PROJECT_ID = "project_id"
KEY_USERNAME = "username"
KEY_USER_ROLE = "userrole"
KEY_EMAIL = "email"
class DatabaseNames:
ilens_configuration = DatabaseConstants.metadata_db
ilens_assistant = DatabaseConstants.ilens_assistant_db
class CollectionNames:
form_props = "form_props"
scheduled_info = "scheduled_info"
templates = "templates"
forms = "forms"
unique_id = "unique_id"
user = "user"
user_project = "user_project"
lookup_table = "lookup_table"
template_category = "template_category"
step_category = "step_category"
constants = "constants"
workflows = "workflows"
workflow_permissions = "workflow_permissions"
triggers = "triggers"
task_instance_data = "task_instance_data"
product_master = "product_master"
periodic_data = "periodic_data"
project_remarks = "project_remarks"
action_templates = "action_templates"
user_role = "user_role"
shift_details = "shift_details"
site_conf = "site_conf"
customer_projects = "customer_projects"
logbook = "logbook"
logbook_links = "logbook_links"
job_list = "job_list"
schedule_metadata = "schedule_metadata"
tasks = "task_info"
task_instance = "task_instances"
steps = "steps"
step_templates = "step_templates"
step_data_files = "step_data_files"
class CommonKeys:
KEY_USER_ID = "user_id"
KEY_PROCESS_TEMPLATE = "process_template"
KEY_SITE_TEMPLATE = "site_template"
KEY_PROCESS_TEMPLT_ID = "process_templt_id"
KEY_KEY_LIST = "key_list"
KEY_VALUE = "value"
KEY_SITE_TEMPLT_ID = "site_templt_id"
KEY_TYPE = "type"
KEY_LOOKUP = "lookup_name"
KEY_CREATED_BY = "created_by"
KEY_CREATED_TIME = "created_at"
KEY_UPDATED_BY = "updated_by"
KEY_LAST_UPDATED_TIME = "updated_at"
class StepCategories:
NON_PERIODIC = "step_category_100"
TASK_CREATION = "step_category_101"
PERIODIC = "step_category_102"
TRIGGER_BASED = "step_category_103"
class FactorsInTriggerCompletion:
CONSOLIDATED = ["end_of_day", "end_of_week"]
END_OF_DAY = "end_of_day"
END_OF_WEEK = "end_of_week"
class UniqueIdKeys:
KEY_ID = "id"
KEY_KEY = "key"
class CustomObjects:
model_types_for_psql_tables = ["production_losses"]
custom_models_to_list = ["ope_formula_calculation", "oee_formula_calculation", "rm_consumption", "util_std_norm",
"gen_std_norm",
"oee_production_losses", "oee_daily_production", "oee_master_table"]
oee_production_losses = "oee_production_losses"
class CommonConstants(object):
__iso_format__ = '%Y-%m-%dT%H:%M:%S%z'
class FormEndPoints:
api_form = "/form"
api_mobile_form_multiple = "/form_load_multiple"
api_wrk_task_details = "/mobile/wrk_task_details"
api_stage = "/stage"
api_stages_data = "/stage_data"
api_add_periodic_data = "/add_periodic_data"
api_get_tags = "/get_tags"
api_get_time_list = "/get_time_list"
api_timewise_tags = "/timewise_tags"
api_trigger = "/trigger"
api_trigger_task_completion = "/trigger_task_completion"
api_mark_task_complete = "/mark_task_complete"
api_custom = "/custom"
api_reference = "/reference"
api_save_table = "/save_table"
api_list_periodic_steps = "/list_periodic_steps"
api_get = "/get"
api_list = "/list"
api_render = "/render"
api_mobile = "/mobile"
api_remark = "/remark"
api_save = "/save"
api_send_notification = "/send_notification"
api_copy_property_values = "/copy_property_values"
api_backfill = "utils/periodic_data/auto/insert"
api_search_asset = "tags_v2/search_asset"
api_get_user_details = "ilens_config/get_user_details"
class CommentsEndPoints:
api_comment = "/comments"
api_list = "/list"
class DataEngineEndpoints:
api_schedule = "schd/task/schedule"
api_iot_param = "iot_param/get"
class OeeServicesEndpoints:
api_create_batch = "/calculator/batch_oee/calculate"
class CustomEndPoints:
api_save_table = f"custom{FormEndPoints.api_save_table}"
class StageDataEndPoints:
api_create_template = "/save"
api_list_template = "/list"
api_template_table_options = "/template_table_options"
api_delete_template = "/delete"
api_fetch_template = "/fetch"
api_download_template = "/template_download"
api_upload_data_sheet = "/upload_data_file"
api_get_templates = "/get_templates"
api_get_file_data_list = "/uploaded_file_list"
api_download_data_file = "/download_data_file"
api_delete_data_file = "/delete_data_file"
api_back_fill_data = "utils/periodic_data/direct/insert"
class OtherEndPoints:
api_send_email = "/api/v1/eim/email/send"
class EventsEndPoints:
api_create_event = "/api/event/create"
from scripts.config.app_configurations import DatabaseConstants
class DatabaseNames:
ilens_configuration = DatabaseConstants.metadata_db
ilens_assistant = DatabaseConstants.ilens_assistant_db
ilens_asset_model = DatabaseConstants.ilens_asset_model_db
class CollectionNames:
form_props = "form_props"
scheduled_info = "scheduled_info"
forms = "forms"
unique_id = "unique_id"
user = "user"
tags = "tags"
shifts = "shifts"
lookup_table = "lookup_table"
template_category = "template_category"
step_category = "step_category"
constants = "constants"
workflows = "workflows"
workflow_permissions = "workflow_permissions"
triggers = "triggers"
product_master = "product_master"
periodic_data = "periodic_data"
reference_steps = "reference_steps"
project_remarks = "project_remarks"
action_templates = "action_templates"
user_role = "user_role"
shift_details = "shift_details"
site_conf = "site_conf"
customer_projects = "customer_projects"
logbook = "logbook"
logbook_links = "logbook_links"
job_list = "job_list"
schedule_metadata = "schedule_metadata"
steps = "steps"
trigger_steps = "trigger_steps"
task_instance_data = "task_instance_data"
task_instances = "task_instances"
tasks = "task_info"
asset_model_details = "asset_model_details"
rule_targets = "rule_targets"
class TaskInstanceDataKeys:
KEY_STAGE_ID = "stage_id"
KEY_TASK_ID = "task_id"
KEY_STATUS = "status"
KEY_STEP_ID = "step_id"
class Secrets:
LOCK_OUT_TIME_MINS = 30
leeway_in_mins = 10
unique_key = '45c37939-0f75'
token = '8674cd1d-2578-4a62-8ab7-d3ee5f9a'
issuer = "ilens"
alg = "RS256"
signature_key = 'kliLensKLiLensKL'
signature_key_alg = ["HS256"]
class SiteConfCollectionKeys:
KEY_SITE_NAME = "site_name"
KEY_SITE_INFO = "site_info"
KEY_CUSTOMER_PROJECT_ID = "customer_project_id"
KEY_SITE_ID = "site_id"
KEY_PRODUCT_ENCRYPTED = "product_encrypted"
KEY_PROCESS_ID = "process_id"
class AuditingKeys:
periodic = "periodic"
trigger_based = "trigger-based"
non_periodic = "non-periodic"
user = "user"
machine = "machine"
data_published = "data_published"
success = "success"
failed = "failed"
class CustomerProjectKeys:
KEY_CUSTOMER_PROJECT_ID = "customer_project_id"
KEY_CUSTOMER_PROJECT_NAME = "customer_project_name"
class CustomKeys:
ACTUAL_PRODUCTION_MTD = "actual_production_mtd"
CAPACITY_FOR_SHIFT_SUMMARY_MTD = "capacity_for_shift_mtd"
PRODUCTION_LOSS_MTD = "production_loss_mtd"
OPE_MTD = "ope_mtd"
class CommonStatusCode:
SUCCESS_CODES = (
200,
201,
204,
)
class SubmitAction:
refresh = "refresh"
save = "save"
view = "view"
class EmailAuth:
username = 'AllGoodNamesRGone'
password = 'comBRANSeANtamasEbICaPeC'
ui_time_format_data = {
"dd/MM/yyyy HH:mm:ss": "%d/%m/%Y %H:%M:%S",
"dd-MM-yyyy HH:mm:ss": "%d-%m-%Y %H:%M:%S",
"yyyy/dd/MM HH:mm:ss": "%Y/%d/%m %H:%M:%S",
"yyyy-dd-MM HH:mm:ss": "%Y-%d-%m %H:%M:%S",
"yyyy/MM/dd HH:mm:ss": "%Y/%m/%d %H:%M:%S",
"yyyy-MM-dd HH:mm:ss": "%Y-%m-%d %H:%M:%S",
"dd/MM/yyyy": "%d/%m/%Y",
"dd-MM-yyyy": "%d-%m-%Y",
"yyyy/dd/MM": "%Y/%d/%m",
"yyyy-dd-MM": "%Y-%d-%m",
"yyyy/MM/dd": "%Y/%m/%d",
"yyyy-MM-dd": "%Y-%m-%d",
"MM/dd/yyyy": "%m/%d/%Y",
"MM-dd-yyyy": "%m-%d-%Y",
"yyyy-dd-MM HH:mm": "%Y-%m-%d %H:%M",
"yyyy-MM-dd HH:mm": "%Y-%m-%d %H:%M",
"dd-MM HH:mm:ss": "%d-%m %H:%M:%S",
"dd/MM HH:mm:ss": "%d/%m %H:%M:%S",
"MM-dd HH:mm:ss": "%m-%d %H:%M:%S",
"MM/dd HH:mm:ss": "%m/%d %H:%M:%S",
"monthYear": "%b, %Y",
"dateMonth": "%d %b",
"monthDate": "%b %d",
"dateMonthYear": "%d %b, %Y",
"yearDateMonth": "%Y, %d %b",
"monthDateYear": "%b %d, %Y",
"yearMonthDate": "%Y, %b %d",
"MonthDateYear": "%B %d, %Y",
"HH:mm": "%H:%M",
"None": None
}
date_time_with_hour = "%d-%m-%Y %H:%M"
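The mapping above translates UI-side date masks into Python `strftime` directives; a short sketch using two entries copied from the table (the `datetime` value is an arbitrary example):

```python
from datetime import datetime

# Two entries copied from ui_time_format_data above.
ui_time_format_data = {
    "dd/MM/yyyy HH:mm:ss": "%d/%m/%Y %H:%M:%S",
    "yyyy-MM-dd": "%Y-%m-%d",
}

ts = datetime(2022, 3, 14, 9, 26, 53)
print(ts.strftime(ui_time_format_data["dd/MM/yyyy HH:mm:ss"]))  # 14/03/2022 09:26:53
print(ts.strftime(ui_time_format_data["yyyy-MM-dd"]))           # 2022-03-14
```

Because each UI mask maps to exactly one `strftime` string, the same table also works in reverse with `datetime.strptime` for parsing user-entered values.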
class StepRecordKeys:
KEY_STEP_ID = "step_id"
KEY_PROJECT_ID = "project_id"
KEY_STEP_NAME = "step_name"
class TaskKeys:
KEY_PROJECT_ID = "project_id"
KEY_TASK_ID = "task_info_id"
KEY_TASK_INSTANCE = "task_id"
KEY_TASK_CREATION_DATA = "task_creation_data"
KEY_ASSOCIATED_WORKFLOW_ID = "associated_workflow_id"
KEY_WORKFLOW_VERSION = "workflow_version"
KEY_CURRENT_STATUS = "current_status"
class BprRecordKeys:
KEY_BPR_ID = "bpr_id"
KEY_PROJECT_ID = "project_id"
class TemplateCategoryKeys:
KEY_TEMPLATE_CATEGORY_ID = "template_category_id"
KEY_TEMPLATE_NAME = "name"
class ScheduledInfoKeys:
KEY_STEP_ID = "step_id"
KEY_SCHEDULE_PROPERTIES = "schedule_properties"
KEY_SCHEDULE_ID = "schedule_id"
KEY_JOB_ID = "job_id"
class FormPropsKeys:
KEY_STEP_ID = "step_id"
KEY_FORM_INFO = "form_info"
class ComponentInstanceKeys:
KEY_WORKFLOW_COMPONENT_ID = "workflow_component_id"
KEY_NODE_PLAYGROUND_ID = "node_playground_id"
class WorkflowKeys:
KEY_WORKFLOW_ID = "workflow_id"
KEY_WORKFLOW_VERSION = "workflow_version"
KEY_PROJECT_ID = "project_id"
KEY_WORKFLOW_NAME = "workflow_name"
class TagKeys:
KEY_TAG_ID = "id"
KEY_TAG_NAME = "tag_name"
class WorkflowPermissionsKeys:
KEY_WORKFLOW_STATUS = "workflow_status"
KEY_STEP_ID = "step_id"
KEY_WORKFLOW_ID = "workflow_id"
KEY_WORKFLOW_VERSION = "workflow_version"
KEY_USER_ROLE = "user_role"
KEY_PERMISSIONS = "permissions"
class TriggerKeys:
KEY_TRIGGER_ID = "trigger_id"
KEY_TRIGGER_TYPE = "trigger_type"
KEY_ROLE = "role"
class WorkflowInstanceKeys:
KEY_WORKFLOW_INSTANCE_ID = "workflow_instance_id"
KEY_WORKFLOW_SPEC_ID = "workflow_spec_id"
KEY_VERSION_ID = "version_id"
KEY_BUILDER_ID = "builder_id"
KEY_PROJECT_ID = "project_id"
KEY_STATUS = "status"
class PeriodicDataKeys:
KEY_STEP_ID = "step_id"
KEY_DATE = "date"
KEY_DATA = "data"
KEY_MANUAL_DATA = "manual_data"
class StageDataKeys(WorkflowInstanceKeys):
KEY_STAGE_ID = "stage_id"
KEY_TEMPLATE_ID = "template_id"
KEY_DATA = "data"
KEY_STATUS = "status"
STAGE_TYPE = "stage_type"
KEY_IS_DELETED = "is_deleted"
KEY_REMARKS = "remarks"
KEY_STAGE_CONFIGURATION = "stage_configuration"
class ProjectRemarksKeys:
KEY_PROJECT_ID = "project_id"
KEY_REMARKS = "remarks"
# mobility
class ProductMasterDataKeys:
KEY_ID = "id"
KEY_NAME = "name"
# mobility
class ShiftDetailsKeys:
KEY_PROJECT_ID = "project_id"
KEY_USER_ID = "user_id"
KEY_SHIFT_END_TIME = "shift_end_time"
KEY_SHIFT_START_TIME = "shift_start_time"
KEY_IS_STARTED = "is_started"
class StepsCategoryKeys:
pass
class TemplateRecordKeys:
pass
class ReferenceDataKeys:
KEY_STEP_ID = "step_id"
KEY_DATE = "date"
KEY_PROPERTIES = "properties"
KEY_DATA = "data"
KEY_STEP_CATEGORY = "step_category"
KEY_ENTITY_NAME = "entity_name"
KEY_EVENT_ID = "event_id"
KEY_TASK_ID = "task_id"
class StepTemplateKeys:
KEY_PROJECT_ID = "project_id"
KEY_TEMPLATE_ID = "template_id"
KEY_TEMPLATE_NAME = "template_name"
KEY_LOGBOOK_ID = "logbook_id"
class StepDataFileKeys:
KEY_PROJECT_ID = "project_id"
KEY_TEMPLATE_ID = "template_id"
KEY_FILE_ID = "file_id"
KEY_FILE_NAME = "file_name"
class AssetDetailsKeys:
KEY_PROJECT_ID = "project_id"
KEY_ASSET_MODEL_NAME = "asset_model_name"
KEY_ASSET_MODEL_ID = "asset_model_id"
KEY_ASSET_DESCRIPTION = "asset_description"
KEY_ASSET_VERSION = "asset_version"
KEY_ASSET_MODEL_ICON = "asset_model_icon"
class DataNotFound(Exception):
pass
class KairosDBError(Exception):
pass
class LogbookConstants:
external_action_data = [
{
"action": "addnew",
"label": "Create New",
"type": "addnew"
}
]
table_actions_action_data = [
{
"action": "edit",
"label": "Edit",
"type": "edit",
"icon-class": "fa fa-pencil"
},
{
"action": "delete",
"label": "Delete",
"type": "delete",
"icon-class": "fa fa-trash"
}
]
headerContent = [
{
"value": "logbook_name",
"label": "Logbook Name",
"enable_column_search": True,
"header_type": "text",
"action": {
"action": "edit",
"type": "edit",
"label": "Edit"
},
"enableClick": True,
"style": "indicate-item cursor-pointer"
},
{
"value": "logbook_description",
"label": "Description",
"enable_column_search": True,
"header_type": "text"
},
# {"value": "workflow_name",
# "label": "Workflow",
# "enable_column_search": True,
# "header_type": "select",
# "options": [
# ]
# },
{
"label": "Business Process Tags",
"value": "business_process_tags",
"enable_column_search": True,
"header_type": "text"
},
{
"value": "updated_on",
"label": "Last Modified on",
"enable_column_search": True,
"header_type": "date_range"
},
{
"value": "updated_by",
"label": "Last Modified by",
"enable_column_search": True,
"header_type": "select",
"options": [
{
"label": "Login module",
"value": "Login module"
}
]
},
]
class StageConstants:
mark_as_completed = {
"label": "Mark Step as Completed",
"value": "mark_complete",
"type": "toggle",
"check_completion": False,
"properties": {
"btn_class": "btn-primary",
"bg_color": "#0f62fe",
"icon": "fa-floppy-o",
"class": "pull-right"
}
}
mark_complete_icon = "fa fa-check-circle text-success"
mark_complete_icon_color = "#20f952"
class TemplateListConstants:
external_action_data = [
{
"action": "addnew",
"label": "Create New",
"type": "addnew"
}
]
table_actions_action_data = [
{
"action": "edit",
"label": "Edit",
"type": "edit",
"icon-class": "fa fa-pencil"
},
{
"action": "delete",
"label": "Delete",
"type": "delete",
"icon-class": "fa fa-trash"
}
]
headerContent = [
{
"value": "step_name",
"label": "Step Name",
"enable_column_search": True,
"header_type": "text",
"action": {
"action": "edit",
"label": "edit",
"type": "edit",
"icon-class": "fa-eye"
},
"enableClick": True,
"style": "indicate-item cursor-pointer"
},
{
"value": "description",
"label": "Description",
"enable_column_search": True,
"header_type": "text"
},
{
"value": "created_by",
"label": "Created By",
"enable_column_search": True,
"header_type": "select",
"options": [
{
"label": "Login module",
"value": "Login module"
}
]
},
{
"value": "created_on",
"label": "Created On",
"enable_column_search": True,
"header_type": "date_range"
},
{
"value": "step_category",
"label": "Step Type",
"enable_column_search": True,
"header_type": "select",
"options": [
]
}
]
class TemplateConstants:
external_action_data = [
]
table_actions_action_data = [
{
"action": "delete",
"label": "Delete",
"type": "delete",
"icon-class": "fa fa-trash"
}
]
headerContent = [
{
"value": "template_name",
"label": "Template Name",
"enable_column_search": True,
"header_type": "text",
"action": {
"action": "edit",
"type": "edit",
"label": "Edit"
},
"enableClick": True,
"style": "indicate-item cursor-pointer"
},
{
"value": "logbook_name",
"label": "Logbook Name",
"enable_column_search": True,
"header_type": "text"
},
{
"value": "updated_by",
"label": "Last Modified by",
"enable_column_search": True,
"header_type": "select",
"options": [
]
},
{
"value": "updated_on",
"label": "Last Modified on",
"enable_column_search": True,
"header_type": "date_range"
},
]
class TemplateStorage:
templates_files = "templates"
upload_data_files = "data_files"
class WorkflowConstants:
external_action_data = [
{
"action": "addnew",
"label": "Create New",
"type": "addnew"
}
]
table_actions_action_data = [
{
"action": "edit",
"label": "Edit",
"type": "edit",
"icon-class": "fa fa-pencil"
},
{
"action": "delete",
"label": "Delete",
"type": "delete",
"icon-class": "fa fa-trash"
}
]
headerContent = [
{
"value": "workflow_name",
"label": "Workflow Name",
"enable_column_search": True,
"header_type": "text",
"action": {
"action": "edit",
"label": "Edit",
"type": "edit",
"icon-class": "fa fa-pencil"
},
"enableClick": True,
"style": "indicate-item cursor-pointer"
},
{
"value": "workflow_description",
"label": "Description",
"enable_column_search": True,
"header_type": "text"
},
{
"value": "tags",
"label": "Tags",
"enable_column_search": True,
"header_type": "text"
},
{
"value": "created_by",
"label": "Created By",
"enable_column_search": True,
"header_type": "select",
"options": [
{
"label": "Login module",
"value": "Login module"
}
]
},
{
"value": "created_on",
"label": "Created On",
"enable_column_search": True,
"header_type": "date_range"
}
]
import os
import httpx
from scripts.config.app_configurations import PathToServices
from scripts.constants.api import OeeServicesEndpoints
from scripts.constants.app_constants import CommonStatusCode
from scripts.constants.date_constants import ui_time_format_data
from scripts.core.engine.task_engine import TaskEngine
from scripts.core.schemas.forms import CustomActionsModel
from scripts.db import mongo_client, TaskInstance
from scripts.db.mongo.ilens_assistant.collections.logbook import LogbookInfo
from scripts.db.mongo.ilens_configuration.aggregations.config_aggregate import ConfigAggregate
from scripts.db.mongo.ilens_configuration.collections.customer_projects import CustomerProjects
from scripts.errors import InternalError
from scripts.logging.logging import logger
from scripts.utils.common_utils import CommonUtils
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
self.customer_projects_con = CustomerProjects(mongo_client=mongo_client)
self.config_aggregate = ConfigAggregate()
self.task_engine = TaskEngine(project_id=self.custom_model.project_id)
self.task_inst_conn = TaskInstance(mongo_client, project_id=custom_action.project_id)
self.logbook_conn = LogbookInfo(mongo_client=mongo_client, project_id=custom_action.project_id)
self.common_utils = CommonUtils()
self.create_batch_api = f"{PathToServices.OEE_SERVICES}{OeeServicesEndpoints.api_create_batch}"
def trigger_action(self):
try:
site_templates = self.customer_projects_con.get_project_data_by_aggregate(
self.config_aggregate.get_project_template(self.custom_model.project_id))
site_templates = site_templates[0].get("data") if bool(site_templates) else []
hierarchy_id_str = ""
task_data = self.task_inst_conn.find_by_task_id(task_id=self.custom_model.task_details.task_id)
logbook_data = self.logbook_conn.find_by_id(task_data.logbook_id)
if hierarchy := self.task_engine.get_hierarchy(logbook_data.dict(), task_data.dict()):
hierarchy_id_str = self.task_engine.get_hierarchy_string(hierarchy, site_templates)
task_creation_time = task_data.meta.get("created_at")
start_property_name = os.environ.get("OEE_START_TIME_KEY", default="oee_start_time")
prod_start_time = self.common_utils.get_task_time(task_time=task_creation_time,
custom_model=self.custom_model,
task_property_name=start_property_name)
prod_start_time = self.common_utils.get_iso_format(timestamp=int(prod_start_time.timestamp()),
timezone=self.custom_model.tz,
timeformat=ui_time_format_data["yyyy-MM-dd HH:mm:ss"])
payload = dict(reference_id=task_data.reference_id,
hierarchy=hierarchy_id_str,
prod_start_time=prod_start_time,
batch_type="create",
project_id=self.custom_model.project_id,
tz=self.custom_model.tz)
            with httpx.Client() as client:
                resp = client.post(url=self.create_batch_api, cookies=self.custom_model.request_obj.cookies,
                                   json=payload, timeout=15)
            logger.debug(f"Resp Code:{resp.status_code}")
            if resp.status_code not in CommonStatusCode.SUCCESS_CODES:
                logger.error(f"Failed while calling custom API: {resp.status_code}")
                # raise InternalError(f"API not callable: Status - {resp.status_code}")
            # httpx header lookup is case-insensitive; default to "" so a missing header
            # cannot raise AttributeError on .startswith()
            if resp.headers.get("content-type", "").startswith("application/json"):
                message = resp.json()
            else:
                message = dict(message="Unable to decode response, API Triggered")
            return True, message.get("message", "Batch Created successfully")
except InternalError:
raise
except Exception as e:
logger.error(f"Exception occurred while creating batch: {e}")
import os
import time
from datetime import datetime
import httpx
import pytz
from scripts.config.app_configurations import PathToServices
from scripts.constants.api import OeeServicesEndpoints
from scripts.constants.app_constants import CommonStatusCode
from scripts.constants.date_constants import ui_time_format_data
from scripts.core.engine.task_engine import TaskEngine
from scripts.core.schemas.forms import CustomActionsModel
from scripts.db import mongo_client, TaskInstance
from scripts.db.mongo.ilens_assistant.collections.logbook import LogbookInfo
from scripts.db.mongo.ilens_configuration.aggregations.config_aggregate import ConfigAggregate
from scripts.db.mongo.ilens_configuration.collections.customer_projects import CustomerProjects
from scripts.errors import InternalError
from scripts.logging.logging import logger
from scripts.utils.common_utils import CommonUtils
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
self.customer_projects_con = CustomerProjects(mongo_client=mongo_client)
self.config_aggregate = ConfigAggregate()
self.task_engine = TaskEngine(project_id=self.custom_model.project_id)
self.task_inst_conn = TaskInstance(mongo_client, project_id=custom_action.project_id)
self.logbook_conn = LogbookInfo(mongo_client=mongo_client, project_id=custom_action.project_id)
self.common_utils = CommonUtils()
self.create_batch_api = f"{PathToServices.OEE_SERVICES}{OeeServicesEndpoints.api_create_batch}"
def trigger_action(self):
try:
site_templates = self.customer_projects_con.get_project_data_by_aggregate(
self.config_aggregate.get_project_template(self.custom_model.project_id))
site_templates = site_templates[0].get("data") if bool(site_templates) else []
hierarchy_id_str = ""
task_data = self.task_inst_conn.find_by_task_id(task_id=self.custom_model.task_details.task_id)
logbook_data = self.logbook_conn.find_by_id(task_data.logbook_id)
if hierarchy := self.task_engine.get_hierarchy(logbook_data.dict(), task_data.dict()):
hierarchy_id_str = self.task_engine.get_hierarchy_string(hierarchy, site_templates)
task_creation_time = task_data.meta.get("created_at")
task_completion_time = task_data.meta.get("completed_at")
start_property_name = os.environ.get("OEE_START_TIME_KEY", default="oee_start_time")
end_property_name = os.environ.get("OEE_END_TIME_KEY", default="oee_end_time")
prod_start_time = self.common_utils.get_task_time(task_time=task_creation_time,
custom_model=self.custom_model,
task_property_name=start_property_name)
prod_end_time = self.common_utils.get_task_time(task_time=task_completion_time,
custom_model=self.custom_model,
task_property_name=end_property_name,
task_type="complete")
            if not prod_end_time:
                # Fall back to the task completion timestamp (milliseconds) or the current time.
                # The original assigned this float to prod_start_time, which would fail on the
                # .timestamp() call below; assign a timezone-aware datetime to prod_end_time instead.
                fallback_ts = task_completion_time / 1000 if task_completion_time else time.time()
                prod_end_time = datetime.fromtimestamp(fallback_ts, tz=pytz.timezone(self.custom_model.tz))
            if prod_end_time < prod_start_time:
                # The end time must not precede the start time; clamp it to the current time
                prod_end_time = datetime.now(tz=pytz.timezone(self.custom_model.tz))
prod_end_time = self.common_utils.get_iso_format(
timestamp=int(prod_end_time.timestamp()),
timezone=self.custom_model.tz,
timeformat=ui_time_format_data["yyyy-MM-dd HH:mm:ss"])
prod_start_time = self.common_utils.get_iso_format(timestamp=int(prod_start_time.timestamp()),
timezone=self.custom_model.tz,
timeformat=ui_time_format_data["yyyy-MM-dd HH:mm:ss"])
payload = dict(reference_id=task_data.reference_id,
hierarchy=hierarchy_id_str,
project_id=self.custom_model.project_id,
tz=self.custom_model.tz,
prod_start_time=prod_start_time,
prod_end_time=prod_end_time)
            with httpx.Client() as client:
                resp = client.post(url=self.create_batch_api, cookies=self.custom_model.request_obj.cookies,
                                   json=payload, timeout=15)
            logger.debug(f"Resp Code:{resp.status_code}")
            if resp.status_code not in CommonStatusCode.SUCCESS_CODES:
                logger.error(f"Failed while calling custom API: {resp.status_code}")
                # raise InternalError(f"API not callable: Status - {resp.status_code}")
            # httpx header lookup is case-insensitive; default to "" so a missing header
            # cannot raise AttributeError on .startswith()
            if resp.headers.get("content-type", "").startswith("application/json"):
                message = resp.json()
            else:
                message = dict(message="Unable to decode response, API Triggered")
            return True, message.get("message", "Batch Created successfully")
except InternalError:
raise
except Exception as e:
            logger.error(f"Exception occurred while finishing batch: {e}")
import time
from datetime import datetime
import pytz
import requests
from scripts.config.app_configurations import PathToServices
from scripts.constants.app_constants import CommonStatusCode
from scripts.core.schemas.forms import CustomActionsModel
from scripts.db import TaskInstance
from scripts.db import mongo_client
from scripts.logging.logging import logger
from scripts.utils.stage_parser import StageParser
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
self.task_inst_conn = TaskInstance(mongo_client, project_id=custom_action.project_id)
self.stage_parser = StageParser(project_id=custom_action.project_id)
def trigger_action(self):
try:
left_stages = self.stage_parser.get_stage_parser(self.custom_model.task_details.stages).get("left", [])
self.custom_model.task_details.meta.update({"completed_at": int(time.time() * 1000)})
task_meta = {"meta": self.custom_model.task_details.meta, "current_stage": left_stages[-1]}
self.task_inst_conn.update_instance_task(task_id=self.custom_model.task_details.task_id, data=task_meta,
upsert=False)
insert_json = {"task_completed_at": datetime.now(tz=pytz.timezone(self.custom_model.tz)).isoformat()}
request_json = {"service_type": 'update', "data": {"task_id": self.custom_model.task_details.task_id,
"project_id": self.custom_model.project_id,
"data": insert_json}}
try:
api_url = f'{PathToServices.AUDIT_PROXY}/task/tracker'
resp = requests.post(url=api_url, cookies=self.custom_model.request_obj.cookies,
json=request_json)
logger.debug(f"Resp Code:{resp.status_code}")
if resp.status_code in CommonStatusCode.SUCCESS_CODES:
response = resp.json()
logger.debug(f"Response:{response}")
except requests.exceptions.ConnectionError as e:
logger.exception(e.args)
return False, False
except Exception as e:
logger.error(f"Exception occurred in marking stage complete: {e}")
import traceback
from datetime import datetime
import pytz
from scripts.core.schemas.forms import CustomActionsModel
from scripts.db import User, mongo_client
from scripts.logging.logging import logger
from scripts.utils.mqtt_util import push_notification
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
self.user_conn = User(mongo_client)
def trigger_action(self):
notification = dict(
type="ilens_assistant",
message=f"{self.custom_model.action.get('message', '')}",
notification_message="Notification Generated Successfully",
notification_status="success",
available_at=datetime.now().astimezone(
pytz.timezone(self.custom_model.tz)).strftime("%d-%m-%Y %I:%M%p"),
mark_as_read=False
)
try:
user_data = self.user_conn.find_user_data_with_roles(self.custom_model.action.get("user_roles"),
project_id=self.custom_model.project_id)
for each in user_data:
push_notification(notification, each.get("user_id"))
return False, False
except Exception as e:
notification.update(type="ilens_assistant",
message="Notification failed to generate",
notification_message="Failed to send notification",
notification_status="failed")
logger.error(f"Error while sending notification {e.args}")
logger.error(traceback.format_exc())
import httpx
from scripts.constants import StatusCodes
from scripts.core.schemas.custom_models import CustomRestAPIRequest
from scripts.core.schemas.forms import CustomActionsModel
from scripts.core.schemas.stages import APIAction
from scripts.errors import InternalError
from scripts.logging.logging import logger
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
def trigger_action(self):
try:
request_object = APIAction(**self.custom_model.action)
try:
headers = {
'login-token': self.custom_model.request_obj.headers.get('login-token',
self.custom_model.request_obj.cookies.get(
'login-token')),
'projectId': self.custom_model.request_obj.cookies.get("projectId",
self.custom_model.request_obj.cookies.get(
"project_id",
self.custom_model.request_obj.headers.get(
"projectId"))),
'userId': self.custom_model.request_obj.cookies.get("userId",
self.custom_model.request_obj.cookies.get(
"user_id",
self.custom_model.request_obj.headers.get(
"userId")))}
if request_object.request_type == "POST":
date = int(self.custom_model.date) / 1000 if self.custom_model.date else 0
request_json = CustomRestAPIRequest(submitted_data=self.custom_model.submitted_data,
stage_id=self.custom_model.stage_id,
project_id=self.custom_model.project_id,
task_id=self.custom_model.task_details.task_id,
tz=self.custom_model.tz, date=date)
logger.info(f"RESTAPI POST PAYLOAD: {request_json.dict()}")
logger.info(f"Headers: {headers} , Cookies: {self.custom_model.request_obj.cookies}")
with httpx.Client() as client:
resp = client.post(url=request_object.api, cookies=self.custom_model.request_obj.cookies,
json=request_json.dict(), headers=headers, timeout=15)
elif request_object.request_type == "GET":
with httpx.Client() as client:
resp = client.get(url=request_object.api, cookies=self.custom_model.request_obj.cookies,
headers=headers,
timeout=15)
else:
raise NotImplementedError
except Exception as e:
logger.error(f"Failed in calling REST API: {e}")
raise InternalError(f"API not callable: {e}") from e
logger.debug(f'{resp.status_code},{resp.text}')
if resp.status_code not in StatusCodes.SUCCESS:
logger.error(f"Failed while calling custom API: {resp.status_code}")
raise InternalError(f"API not callable: Status - {resp.status_code}")
            # httpx header lookup is case-insensitive; default to "" so a missing header
            # cannot raise AttributeError on .startswith()
            if resp.headers.get("content-type", "").startswith("application/json"):
                message = resp.json()
            else:
                message = dict(message="Unable to decode response, API Triggered")
logger.info(f"Message returned in Custom API: {message}")
return True, message.get("message", "API Triggered successfully")
except InternalError:
raise
except Exception as e:
            logger.error(f"Exception occurred in the REST API action: {e}")
import time
import traceback
import httpx
from scripts.config.app_configurations import OtherService, PathToServices
from scripts.constants.api import OtherEndPoints, FormEndPoints
from scripts.constants.app_constants import EmailAuth, CommonStatusCode
from scripts.core.schemas.forms import CustomActionsModel
from scripts.core.schemas.other_schemas import EmailRequest
from scripts.db import User
from scripts.db import mongo_client
from scripts.db.mongo.ilens_configuration.collections.rule_targets import RuleTargets
from scripts.logging.logging import logger
class CustomAction:
def __init__(self, custom_action: CustomActionsModel):
self.custom_model: CustomActionsModel = custom_action
self.user_conn = User(mongo_client)
self.rule_targets = RuleTargets(mongo_client)
def get_target_details(self, target_id):
try:
return template.get("data", {}).get("emailId", []) if (
template := self.rule_targets.find_one({"rule_target_id": target_id})) else []
except Exception as e:
logger.exception(f"Exception occurred while fetching target details {e}")
return False
def trigger_action(self):
try:
api_email = f"{OtherService.EMAIL_URL}{OtherEndPoints.api_send_email}"
if self.custom_model.action.get("emailSelectionType", "") == "target":
recipients, payload, mail_ids = self.custom_model.action.get("target_id", ""), [], []
else:
recipients, payload, mail_ids = self.custom_model.action.get("user_roles", []), [], []
if recipients:
if self.custom_model.action.get("emailSelectionType", "") != "target":
user_data = self.get_user_email_ids(self.custom_model, recipients)
mail_ids.extend(each.get("email", "") for each in user_data.get("users", []))
else:
mail_ids = self.get_target_details(recipients)
payload = EmailRequest(receiver_list=mail_ids, from_name="iLens Assistant Tasks",
content=self.custom_model.action.get("message", ""),
subject=self.custom_model.action.get("subject", ""))
logger.info(f"content attached in mail for {self.custom_model.task_details.task_id}")
if payload:
with httpx.Client() as client:
for _ in range(3):
resp = client.post(url=api_email, json=payload.dict(),
auth=(EmailAuth.username, EmailAuth.password))
logger.debug(f"Resp Code:{resp.status_code}")
if resp.status_code in CommonStatusCode.SUCCESS_CODES:
scheduler_response = resp.json()
logger.debug(f"Email Response:{scheduler_response}")
                            return True, "Email sent successfully"  # match the (status, message) convention of other actions
time.sleep(3)
raise RuntimeError("Email Service Not Available")
        except Exception as e:
            logger.error(f"Error while sending email {e.args}")
            logger.error(traceback.format_exc())
        # A `return` inside `finally` would swallow the RuntimeError above and override
        # the success return, so the failure result is returned here instead.
        return False, False
def get_user_email_ids(self, request_data, roles):
payload = dict(project_id=request_data.project_id, user_role_list=roles, keys=["user"])
api_get_user_details = f"{PathToServices.METADATA_SERVICES}{FormEndPoints.api_get_user_details}"
with httpx.Client() as client:
for _ in range(3):
cookies = self.custom_model.request_obj.cookies
resp = client.post(url=api_get_user_details, json=payload,
cookies=cookies)
logger.debug(f"Resp Code:{resp.status_code}")
if resp.status_code in CommonStatusCode.SUCCESS_CODES:
_response = resp.json()
logger.debug(f"MetaData Response:{_response}")
return _response.get("data", {})
time.sleep(3)
import pandas as pd
from scripts.constants.app_constants import CustomKeys
from scripts.db import PeriodicData, mongo_client
from scripts.logging.logging import logger
from scripts.utils.data_processor import ProcessData
class CustomImplementations:
def __init__(self, project_id=None):
self.processor = ProcessData(project_id=project_id)
self.periodic_conn = PeriodicData(mongo_client, project_id=project_id)
def form_data_df(self,
data_for_day,
tz, current_day=False):
try:
day_df = pd.DataFrame(columns=['tag', 'time', 'values']) if not current_day else pd.DataFrame(
columns=['tag', 'time', 'values', 'default'])
for each_time in data_for_day:
val = pd.DataFrame(each_time)
val["ts"] = self.processor.convert_series_format(val["ts"], tz, "%H:%M")
val = val.reset_index().rename(columns={"index": "tag", "ts": "time"})
day_df = pd.concat([day_df, val], ignore_index=True)
return day_df
except Exception as e:
logger.error(f"Error in custom implementation in form_data_df, {e}")
def relative_day(self,
relative_data,
attribute,
form_df,
tz, ):
try:
relative_day_base = self.form_data_df(relative_data, tz)
round_relative_day = self.processor.round_off(relative_day_base, "values")
relative_day_with_attr = self.processor.merge_with_another_df(form_df, round_relative_day,
merge_on=['tag', 'time'])
# Case: No next_day attribute, Handle: empty df returned to prevent overwriting current_day df
if attribute not in relative_day_with_attr:
base_df = pd.DataFrame(columns=['tag', 'time', 'values', 'previous_day', 'next_day', 'default'])
return base_df
relative_day_df = relative_day_with_attr[relative_day_with_attr[attribute] == "true"]
return relative_day_df
except Exception as e:
logger.error(f"Error in custom implementation of relative_day, {e}")
def default_day(self,
record_in_db,
attribute,
default_value,
form_df,
tz, ):
try:
relative_day_base = self.form_data_df(record_in_db, tz)
round_relative_day = self.processor.round_off(relative_day_base, "values")
relative_day_with_attr = self.processor.merge_with_another_df(form_df, round_relative_day,
merge_on=['tag'])
if attribute not in relative_day_with_attr:
base_df = pd.DataFrame(columns=['tag', 'time', 'values', 'previous_day', 'next_day', 'default'])
return base_df
relative_day_df = relative_day_with_attr[relative_day_with_attr[attribute] == default_value]
return relative_day_df
except Exception as e:
            logger.error(f"Error in custom implementation of default_day, {e}")
@staticmethod
def merge_relative(*args):
try:
merged_df = pd.concat(args).drop_duplicates(['tag', 'time', 'previous_day', 'next_day'],
keep='last')
field_props = merged_df.set_index("prop")["values"].to_dict()
return field_props
except Exception as e:
logger.error(f"Error in custom implementation of merge_relative, {e}")
def month_to_date(self, step_id, to_date, form_df):
if "calculate_mtd" in form_df:
mtd_df = form_df[form_df['calculate_mtd'] == "true"]
ref_keys = mtd_df['mtd_on_key'].to_dict()
mtd_list = self.periodic_conn.find_mtd(step_id, to_date, ref_keys)
if not mtd_list:
return {}
mtd = mtd_list[0]
mtd.pop("_id", None)
if {CustomKeys.ACTUAL_PRODUCTION_MTD, CustomKeys.CAPACITY_FOR_SHIFT_SUMMARY_MTD}.issubset(set(mtd)):
                if mtd[CustomKeys.CAPACITY_FOR_SHIFT_SUMMARY_MTD] != 0:
                    # Compute OPE only when capacity is non-zero to avoid a ZeroDivisionError
                    mtd.update({CustomKeys.OPE_MTD: mtd[CustomKeys.ACTUAL_PRODUCTION_MTD]
                                / mtd[CustomKeys.CAPACITY_FOR_SHIFT_SUMMARY_MTD] * 100})
                else:
                    mtd.update({CustomKeys.OPE_MTD: 0})
mtd.update({CustomKeys.PRODUCTION_LOSS_MTD: 100 - mtd[CustomKeys.OPE_MTD]})
return mtd
return {}
def time_associated(self,
form_df,
data_req,
request_data,
next_record,
prv_record,
latest_record):
try:
form_df = form_df[form_df['time_associated'] == "true"].reset_index().rename(
columns={"index": "prop"})
form_df_time = form_df.copy()
final_df = self.form_data_df(data_req, request_data.tz, current_day=True)
rounded_df = self.processor.round_off(final_df, "values")
current_day = self.processor.merge_with_another_df(form_df_time, rounded_df, merge_on=['tag', 'time'])
if "next_day" in current_day:
current_day = current_day[current_day['next_day'] != "true"]
if "previous_day" in current_day:
current_day = current_day[current_day['previous_day'] != "true"]
if "default" in current_day:
current_day = current_day[current_day['default'] != "true"]
# Add field data with attributes as custom properties
next_day_df = pd.DataFrame(columns=['tag', 'time', 'values', 'next_day'])
prev_day_df = pd.DataFrame(columns=['tag', 'time', 'values', 'previous_day'])
default_data = pd.DataFrame(columns=['tag', 'time', 'values', 'default'])
if all([len(next_record) == 0, len(prv_record) == 0, len(latest_record) == 0]):
field_props = current_day.set_index("prop")["values"].to_dict()
return field_props
if len(next_record) != 0:
next_day_df = self.relative_day(next_record, "next_day", form_df, request_data.tz)
if len(prv_record) != 0:
prev_day_df = self.relative_day(prv_record, "previous_day", form_df, request_data.tz)
if len(latest_record) != 0:
default_data = self.default_day(latest_record, "default", "last_value", form_df, request_data.tz)
field_props = self.merge_relative(default_data, current_day, next_day_df, prev_day_df)
return field_props
except Exception as e:
logger.error(f"Error in custom implementation time_associated, {e}")
def get_data_dfs(self,
form_df,
data_req,
request_data,
next_record):
try:
form_df = form_df[form_df['time_associated'] == "true"].reset_index().rename(
columns={"index": "prop"})
form_df_time = form_df.copy()
final_df = self.form_data_df(data_req, request_data.tz)
rounded_df = self.processor.round_off(final_df, "values")
current_day = self.processor.merge_with_another_df(form_df_time, rounded_df, merge_on=['tag', 'time'])
if "next_day" in current_day:
current_day = current_day[current_day['next_day'] != "true"]
# Add field data with attributes as next_day
if next_record:
next_day_df = self.relative_day(next_record, "next_day", form_df, request_data.tz)
else:
next_day_df = pd.DataFrame(columns=["time_associated", "time", "tag", "next_day", "values", "prop"])
return current_day, next_day_df
except Exception as e:
            logger.error(f"Error in custom implementation get_data_dfs, {e}")
import pandas as pd
import requests
from fastapi import Request
from scripts.config.app_configurations import PathToServices
from scripts.constants import StatusCodes
from scripts.constants.api import DataEngineEndpoints
from scripts.core.engine.custom_implementations import CustomImplementations
from scripts.core.schemas.forms import SaveForm, TasksInfoList
from scripts.db import PeriodicData, TaskInstanceData, TaskInstance
from scripts.db import mongo_client
from scripts.logging.logging import logger
from scripts.utils.common_utils import CommonUtils
from scripts.utils.data_processor import ProcessData
class DataEngine:
def __init__(self, project_id=None):
self.common_utils = CommonUtils(project_id=project_id)
self.periodic_conn = PeriodicData(mongo_client, project_id=project_id)
self.tasks_data = TaskInstanceData(mongo_client, project_id=project_id)
self.tasks = TaskInstance(mongo_client, project_id=project_id)
self.processor = ProcessData(project_id=project_id)
self.custom_imp = CustomImplementations(project_id=project_id)
def get_iot_param(self, form_info, form_data, date, tz, request_obj: Request):
try:
tag_dict = dict()
for _name, props in form_info.items():
current_properties = set(props.keys())
if _name not in form_data and {"time", "time_associated", "tag", "manual_entry"}.intersection(
current_properties) in [{"tag"}, {"tag", "manual_entry"}]:
tag_dict.update({_name: props["tag"]})
if len(tag_dict) == 0:
return form_data
iot_values = self.get_tag_values(set(tag_dict.values()), for_date=date, tz=tz, request_obj=request_obj)
if not iot_values:
return form_data
returned_data = {
_name: round(iot_values[tag], 2) if isinstance(iot_values[tag], (int, float)) else iot_values[tag] for
_name, tag in tag_dict.items() if
tag in iot_values.keys()}
form_data.update(returned_data)
return form_data
except Exception as e:
            logger.error(f"Failed in get_iot_param: {e}")
def data_for_next_day(self, date, step_id, form_df, property_name, relative_day=1):
try:
if property_name not in form_df:
return list()
next_date_in_format = self.common_utils.get_next_date(date, "yyyy-MM-dd", relative_day)
next_record = self.periodic_conn.find_by_date_and_step(next_date_in_format, step_id)
if not next_record.data:
return list()
return next_record.data
except Exception as e:
            logger.error(f"Failed in data_for_next_day: {e}")
def get_last_value(self, step_id, today, form_df):
try:
if "default" not in form_df:
return []
form_df = form_df[form_df['default'] == "last_value"]
default_tags = list(form_df["tag"]) if "tag" in form_df.columns else []
last_record = self.periodic_conn.find_by_latest_data(step_id, today, default_tags)
if not last_record:
return []
return [last_record]
except Exception as e:
            logger.error(f"Failed in get_last_value: {e}")
return []
def get_data_for_date(self, request_data: SaveForm, step_id, field_props, date, datetime_obj):
try:
if not field_props:
return dict()
form_df = pd.DataFrame.from_dict(field_props, orient='index')
current_record = self.periodic_conn.find_by_date_and_step(date, step_id)
prv_record = self.data_for_next_day(date, step_id, form_df, "previous_day", -1)
next_record = self.data_for_next_day(date, step_id, form_df, "next_day", 1)
latest_record = self.get_last_value(step_id, datetime_obj, form_df)
month_to_date = self.custom_imp.month_to_date(step_id, datetime_obj, form_df)
current_record.manual_data.update(month_to_date)
if not any([current_record.data, current_record.manual_data, prv_record, next_record, latest_record]):
return dict()
if not any([current_record.data, prv_record, next_record, latest_record]) and current_record.manual_data:
return current_record.manual_data
if all(["time_associated" not in form_df, latest_record, current_record.manual_data]):
return current_record.manual_data
if "time_associated" in form_df:
field_props = self.custom_imp.time_associated(form_df, current_record.data, request_data, next_record,
prv_record, latest_record)
field_props.update(current_record.manual_data)
return field_props
except Exception as e:
            logger.error(f"Failed in get_data_for_date: {e}")
raise
def get_current_and_next_df(self, request_data: SaveForm, field_props, next_record, record):
try:
base_df = pd.DataFrame(columns=["time_associated", "time", "tag", "next_day", "values", "prop", "manual"])
form_df = pd.DataFrame.from_dict(field_props, orient='index')
if not record.data:
return base_df, base_df, form_df, dict()
data_req = record.data
if "time_associated" in form_df:
current, next_day = self.custom_imp.get_data_dfs(form_df, data_req, request_data, next_record)
return current, next_day, form_df, record.manual_data
except Exception as e:
            logger.error(f"Failed in get_current_and_next_df: {e}")
raise
@staticmethod
def get_tag_values(tag_list, request_obj: Request,
ignore_stale=False,
for_date=None,
last_data=True,
tz="Asia/Kolkata",
from_time=None,
to_time=None):
try:
if not tag_list:
return None
cookies = request_obj.cookies
tag_list = [x for x in tag_list if x]
tag_json = dict(tag_list=list(tag_list),
tz=tz)
if for_date:
tag_json.update(filter_by_date=for_date)
elif all([from_time, to_time, not last_data]):
tag_json.update(from_time=from_time, to_time=to_time)
response = requests.post(
f"{PathToServices.DATA_ENGINE}{DataEngineEndpoints.api_iot_param}"
f"?last_data={last_data}&ignore_stale={ignore_stale}", json=tag_json,
timeout=30, cookies=cookies)
status = response.status_code
if status == 404:
raise ModuleNotFoundError
content = response.json()
if "data" not in content or not content["data"]:
return None
elif status not in StatusCodes.SUCCESS or content["status"] != "success" or not content["data"]:
logger.debug(f"Content returned: {content}")
logger.error("Error Encountered: Communication to Data engine was unsuccessful")
return None
            logger.info(f"Communication to Data engine was successful, response content: {content}")
values = content["data"]["values"]
if isinstance(values, list):
values = values[0]
return values
except requests.exceptions.ReadTimeout:
            raise TimeoutError("Request timed out on IoT param call")
except Exception as e:
logger.error(f"Error Encountered while contacting Data Engine, {e}")
raise
def get_tasks_from_logbooks(self, logbook_list):
try:
tasks_list = self.tasks.find_by_logbooks(logbook_list)
return [TasksInfoList(**x) for x in tasks_list]
except Exception as e:
logger.error(f"Error Encountered in get_tasks_from_logbooks, {e}")
raise
def get_data_for_task(self, task_id):
try:
stage_list = self.tasks_data.find_data_by_task_id(task_id)
return stage_list
except Exception as e:
            logger.error(f"Error Encountered in get_data_for_task, {e}")
raise
from copy import deepcopy
from scripts.constants.app_constants import SubmitAction
from scripts.constants.stage_constants import StageConstants
from scripts.db import mongo_client, TaskInstance, Constants, StepCollection, LogbookLinkInfo, Workflow, WorkflowSchema
from scripts.db.common_aggregates import CommonAggregates
from scripts.logging.logging import logger
class StageNavigation:
def __init__(self, project_id=None):
self.logbook_links_conn = LogbookLinkInfo(mongo_client=mongo_client, project_id=project_id)
self.tasks_conn = TaskInstance(mongo_client=mongo_client, project_id=project_id)
self.const_conn = Constants(mongo_client=mongo_client)
self.step_conn = StepCollection(mongo_client=mongo_client, project_id=project_id)
self.common_agg = CommonAggregates()
self.workflow_conn = Workflow(mongo_client=mongo_client, project_id=project_id)
def logbook_links(self, task_data, final_dict):
try:
logbook_links = self.logbook_links_conn.find_by_logbook_id(logbook_id=task_data.get('logbook_id'))
for each in logbook_links.external_links:
links_data = dict(label=each.get("display_title", ""),
link_type=each.get("link_type", ""),
linked_to=each.get("linked_to", ""),
menu_placement=each.get("menu_placement", ""),
type="external_link")
if each["menu_placement"] in final_dict:
final_dict[each["menu_placement"]].append(links_data)
else:
final_dict.update({each["menu_placement"]: [links_data]})
except Exception as e:
logger.exception(f"Error in logbook_links: {e}")
def get_actions(self, workflow_permissions, nav_type, steps, mobility=False):
try:
button_view = self.const_conn.find_constant(_type="button_view_with_permissions",
filter_dict={"_id": 0})
button_view_dict, button_properties = {}, {}
for data in button_view.data:
button_view_dict.update({data.get("action"): data.get("button_label")})
button_properties.update({data.get("action"): data})
step_data = self.step_conn.get_data_by_aggregate(self.common_agg.get_step_details(steps=steps))
step_data = step_data[0] if step_data else dict()
permissions = []
for permission in workflow_permissions:
if "permissions" in permission:
permissions.extend(permission.get("permissions", []))
permissions = list(set(permissions))
permissions_dict = list()
# Add Mark stage complete button
if nav_type == "left" and not mobility:
permissions_dict.append(StageConstants.mark_as_completed)
if SubmitAction.save in permissions:
permissions.append(SubmitAction.refresh)
permissions = sorted(list(set(permissions)))
for permission in permissions:
if permission in button_view_dict and permission != SubmitAction.view:
permissions_temp_dict = {"label": button_view_dict.get(permission),
"value": permission,
**button_properties.get(permission, {})}
permissions_temp_dict.pop("button_label", None)
permissions_temp_dict.pop("permission_label", None)
permissions_temp_dict.pop("action", None)
if permission == SubmitAction.refresh:
permissions_temp_dict.update(action='onlySubmit')
permissions_dict.append(permissions_temp_dict)
return step_data, button_view_dict, permissions_dict
except Exception as e:
logger.exception(f"Error in get_actions: {e}")
raise
@staticmethod
def check_permissions(step_permissions):
if SubmitAction.save in step_permissions and SubmitAction.view in step_permissions:
step_permissions.remove(SubmitAction.view)
if SubmitAction.save not in step_permissions:
return True
return False
def get_stages(self, steps, nav_type, user_role, workflow_permissions, stages_dict, stage_status_map,
stage_status_map_mobile, workflow_id, workflow_version, mobility=False):
try:
final_dict = {}
workflow_data: WorkflowSchema = self.workflow_conn.find_by_id(workflow_id, workflow_version)
step_data, button_view_dict, permissions_dict = \
self.get_actions(workflow_permissions, nav_type, steps, mobility=mobility)
for step in steps:
menu_placement_availability = step_data.get(step, {}).get("menu_placement")
if step not in step_data and not menu_placement_availability:
continue
if step_data[step]["menu_placement"] not in final_dict:
final_dict.update({step_data[step]["menu_placement"]: list()})
step_permissions = list()
actions_list = list()
action_values = set()
for permission in workflow_permissions:
if step == permission.get("step_id") and user_role == permission.get("user_role") and bool(
permission.get("permissions")):
step_permissions.extend(permission.get("permissions"))
for item in permission.get("permissions"):
if bool(button_view_dict.get(item)) and item != SubmitAction.view:
actions_list.append({"label": button_view_dict.get(item), "value": item})
action_values.add(item)
step_permissions = list(set(step_permissions))
if not step_permissions:
continue
if SubmitAction.save not in step_permissions and SubmitAction.view not in step_permissions:
continue
if SubmitAction.save in action_values and SubmitAction.refresh not in action_values:
action_values.add(SubmitAction.refresh)
read_only = self.check_permissions(step_permissions)
disabled_actions = list(set(button_view_dict.keys()) - action_values - {SubmitAction.view})
temp_json = {
"stage_id": stages_dict.get(step),
"value": stages_dict.get(step),
"step_id": step,
"label": step_data.get(step).get("display_title"),
"actions": actions_list,
"disabledActions": disabled_actions,
"status": stage_status_map.get(step, False),
"type": "step",
"readOnly": read_only,
"validation": workflow_data.validation.get(step, False)
}
if mobility:
temp_json.update(status=stage_status_map_mobile.get(step, False))
if stage_status_map.get(step, False) and not mobility:
temp_json.update(iconClass=StageConstants.mark_complete_icon,
iconColor=StageConstants.mark_complete_icon_color)
final_dict[step_data[step]["menu_placement"]].append(deepcopy(temp_json))
return final_dict, permissions_dict
except Exception as e:
logger.exception(f"Error in get_stages: {e}")
raise
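The `check_permissions` helper above decides whether a step renders read-only, and drops a redundant `view` when `save` is also granted. A self-contained sketch of that logic, using plain strings where the real code uses the `SubmitAction` constants (their actual values are assumed):

```python
# Standalone sketch of StageNavigation.check_permissions. "save" and "view"
# stand in for SubmitAction.save / SubmitAction.view (assumed values).
SAVE, VIEW = "save", "view"

def check_permissions(step_permissions):
    # If a user can both save and view, "view" is redundant and is removed
    # in place, mirroring the original's list mutation.
    if SAVE in step_permissions and VIEW in step_permissions:
        step_permissions.remove(VIEW)
    # A step without save permission is rendered read-only.
    if SAVE not in step_permissions:
        return True
    return False

print(check_permissions(["view"]))          # True  (read-only)
print(check_permissions(["save", "view"]))  # False (editable)
```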
from scripts.logging.logging import logger
class TaskEngine:
def __init__(self, project_id=None):
pass
@staticmethod
def get_hierarchy(logbook_data, stage_json):
try:
if logbook_data.get("hierarchy_dict"):
logbook_hierarchy = dict(hierarchyLevel=logbook_data.get("hierarchy_level", ""))
logbook_hierarchy |= logbook_data["hierarchy_dict"]
return logbook_hierarchy
elif "hierarchy" in stage_json.get("task_creation_data", {}):
return stage_json["task_creation_data"]["hierarchy"]
except Exception as e:
logger.exception(f"Error Occurred while fetching the hierarchy details from task ,{e}")
raise
@staticmethod
def get_hierarchy_string(hierarchy, site_templates):
try:
hierarchy_id_list = []
for data in site_templates:
if hierarchy and hierarchy.get(data):
if isinstance(hierarchy.get(data), dict):
hierarchy_id_list.append(hierarchy.get(data).get("value"))
else:
hierarchy_id_list.append(hierarchy.get(data))
return '$'.join(hierarchy_id_list)
except Exception as e:
logger.exception(f"Error Occurred while converting to hierarchy_id from task ,{e}")
raise
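`get_hierarchy_string` walks the site-template keys in order, unwraps dict-valued entries via their `"value"` field, and joins the ids with `$`. A runnable sketch with invented hierarchy data:

```python
# Hedged sketch of TaskEngine.get_hierarchy_string: ids are collected in
# site_templates order, dict entries contribute their "value" field, and
# missing levels are skipped. Sample hierarchy data is invented.
def get_hierarchy_string(hierarchy, site_templates):
    hierarchy_id_list = []
    for data in site_templates:
        if hierarchy and hierarchy.get(data):
            if isinstance(hierarchy.get(data), dict):
                hierarchy_id_list.append(hierarchy[data].get("value"))
            else:
                hierarchy_id_list.append(hierarchy[data])
    return "$".join(hierarchy_id_list)

hierarchy = {"site": "site_01", "plant": {"value": "plant_07", "label": "Plant 7"}}
# "line" is absent from the hierarchy, so it is skipped in the join.
print(get_hierarchy_string(hierarchy, ["site", "plant", "line"]))  # site_01$plant_07
```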
from scripts.core.schemas.comments import TagList
from scripts.db import mongo_client
from scripts.db.mongo.ilens_configuration.collections.site_conf import SiteConf
from scripts.logging.logging import logger
class CommentHandler:
def __init__(self, project_id=None):
self.site_conn = SiteConf(mongo_client, project_id=project_id)
def get_tags_list(self, request_data: TagList):
try:
hierarchy = request_data.hierarchy
if not hierarchy:
return list()
hierarchy_level = hierarchy.get("hierarchyLevel")
site_id = hierarchy.get("site")
hierarchy_details = self.site_conn.find_site_by_site_id(site_id=site_id, filter_dict={"_id": 0})
if not hierarchy_details:
return list()
tags = list()
if hierarchy_level == "site":
site_info = hierarchy_details.get("site_info")
tags = site_info.get("tags", [])
else:
details = hierarchy_details.get(hierarchy_level)
hierarchy_id = f"{hierarchy_level}_id"
for data in details:
if data[hierarchy_id] == hierarchy.get(hierarchy_level):
tags = data.get("tags", [])
return tags
except Exception as e:
logger.error(f"Exception while listing tags {str(e)}")
raise
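`get_tags_list` resolves tags differently depending on the hierarchy level: site-level tags come straight from `site_info`, while deeper levels are matched against the sub-document whose `"<level>_id"` equals the requested id. A standalone sketch of that branch, with an invented site document:

```python
# Hedged sketch of the tag lookup inside CommentHandler.get_tags_list.
# hierarchy_details stands in for the SiteConf document; its shape here
# is an illustrative assumption, not the actual collection schema.
def tags_for_hierarchy(hierarchy_details, hierarchy_level, hierarchy):
    if hierarchy_level == "site":
        # Site-level tags live directly under site_info.
        return hierarchy_details.get("site_info", {}).get("tags", [])
    # Other levels: match the sub-document by its "<level>_id" field.
    hierarchy_id = f"{hierarchy_level}_id"
    for data in hierarchy_details.get(hierarchy_level, []):
        if data.get(hierarchy_id) == hierarchy.get(hierarchy_level):
            return data.get("tags", [])
    return []

details = {
    "site_info": {"tags": ["safety"]},
    "plant": [{"plant_id": "p1", "tags": ["boiler", "qa"]}],
}
print(tags_for_hierarchy(details, "site", {}))                # ['safety']
print(tags_for_hierarchy(details, "plant", {"plant": "p1"}))  # ['boiler', 'qa']
```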
from typing import Optional
from pydantic import BaseModel
class UserDataEntryRecord(BaseModel):
type: str
user_id: str
user_name: str
ip_address: str
date_time: int
tag_time: Optional[int]
source: str
previous_value: Optional[str] = ""
new_value: str
property_name: Optional[str] = ""
tag: Optional[str]
task_id: Optional[str] = ""
step_id: str
stage_id: Optional[str] = ""
project_id: Optional[str] = ""
action_status: str = "success"
error_logs: Optional[str]
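A minimal usage sketch for the `UserDataEntryRecord` schema above. The field values are invented for illustration; explicit `= None` defaults are added to the bare `Optional` fields so the sketch also runs under pydantic v2, which (unlike v1) does not treat `Optional[...]` as implicitly optional:

```python
# Hedged instantiation sketch for UserDataEntryRecord; all values invented.
from typing import Optional
from pydantic import BaseModel

class UserDataEntryRecord(BaseModel):
    type: str
    user_id: str
    user_name: str
    ip_address: str
    date_time: int
    tag_time: Optional[int] = None
    source: str
    previous_value: Optional[str] = ""
    new_value: str
    property_name: Optional[str] = ""
    tag: Optional[str] = None
    task_id: Optional[str] = ""
    step_id: str
    stage_id: Optional[str] = ""
    project_id: Optional[str] = ""
    action_status: str = "success"
    error_logs: Optional[str] = None

record = UserDataEntryRecord(
    type="manual_entry",
    user_id="u-001",
    user_name="vaisakh",
    ip_address="10.0.0.5",
    date_time=1700000000,
    source="web",
    new_value="42.5",
    step_id="step_1",
)
print(record.action_status)  # success
```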