Commit d727f672 authored by tarun2512

first commit

parent 040a5998
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="ProjectModuleManager">
    <modules>
      <module fileurl="file://$PROJECT_DIR$/.idea/tast_builder_sdk.iml" filepath="$PROJECT_DIR$/.idea/tast_builder_sdk.iml" />
    </modules>
  </component>
</project>
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2023 Tarun Madamanchi
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
# TaskBuilder-SDK (tast_builder_sdk)
## Introduction
Welcome to the TaskBuilder-SDK repository! This software development kit (SDK) is designed to help you create stunning visualizations for your data and applications. Whether you are building a simple widget, a scientific visualization, or simply want to add interactive graphs and charts to your project, this SDK has you covered.
## Getting Started
To get started with the Visualization-SDK, follow these steps:
1. Clone this repository to your local machine using the following command:
```shell
git clone https://gitlab-pm.knowledgelens.com/noureen.taj/visualization-sdk
```
2. Navigate to the SDK directory:
```shell
cd visualization-sdk
```
3. Install the required dependencies. You can use a virtual environment to manage your dependencies:
```shell
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
```
4. Now, you're ready to use the Visualization-SDK to create beautiful visualizations!
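Once the install steps above are done, a quick way to confirm the environment is wired up is to check that the SDK's package can be found. This is a minimal sketch; the package name `tb_sdk` is inferred from this repository's layout and may differ in your build:

```python
import importlib.util


def is_installed(package: str) -> bool:
    """Return True if `package` is importable in the current environment."""
    return importlib.util.find_spec(package) is not None


# After installation, this should report True for the SDK package.
# `tb_sdk` is an assumption based on this repository's directory layout.
print(is_installed("tb_sdk"))
```

If this prints `False`, double-check that the virtual environment is activated and that `pip install` completed without errors.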
## Building a Wheel File
Here's how you can create a wheel file from the SDK:
1. Ensure you have the `setuptools` and `wheel` packages installed. If you don't have them, you can install them using pip:
```shell
pip install setuptools wheel
```
2. Navigate to the root directory of the SDK (where `setup.py` is located):
```shell
cd visualization-sdk
```
3. Build the source and wheel distributions:
```shell
python setup.py sdist bdist_wheel
```
4. After running the command, you will find a `dist` directory in your project containing the built artifacts. The wheel file will have a `.whl` extension and a normalized name such as `visualization_sdk-1.0.0-py3-none-any.whl` (hyphens in the project name are replaced with underscores).
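The wheel filename follows the PEP 427 convention `{distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl`. As a rough sketch of how the pieces break down (the filename below is illustrative; your version number may differ, and wheels with an optional build tag have six components):

```python
def parse_wheel_name(filename: str) -> dict:
    """Split a wheel filename into its PEP 427 components.

    Assumes no optional build tag; names with a build tag have six parts.
    """
    stem = filename[: -len(".whl")]
    dist, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "distribution": dist,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }


# "py3-none-any" means: any Python 3, no ABI dependency, any platform.
print(parse_wheel_name("visualization_sdk-1.0.0-py3-none-any.whl"))
```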
## Installing the Wheel File
To install the wheel file you just created, you can use pip:
```shell
pip install path/to/Visualization_SDK-1.0.0-py3-none-any.whl
```
Replace `path/to/` with the actual path to your wheel file.
## Usage
Please refer to the documentation and examples provided in this repository to learn how to use the Visualization-SDK effectively. You'll find detailed information on creating various types of visualizations and integrating them into your projects.
## Contribution
We welcome contributions from the community! If you find any issues or have ideas for improvements, please open an issue or submit a pull request. Check out our [contribution guidelines](CONTRIBUTING.md) for more information.
## Contact
If you have any questions or need assistance, you can reach out to us at [noureen.taj@rockwellautomation.com](mailto:noureen.taj@rockwellautomation.com).
Happy visualizing with the Visualization-SDK! 🚀
::: connectors.services.logbooks.logbooks.Logbooks
::: connectors.services.steps.steps.Steps
::: connectors.services.tasks.tasks.Tasks
::: connectors.services.workflows.workflows.Workflows
site_name: Task Builder SDK
theme:
  name: "material"
plugins:
  - mkdocstrings:
      handlers:
        python:
          paths: [tb_sdk]
          options:
            show_source: false
  - search
nav:
  - Tasks: 'tasks.md'
  - Logbooks: 'logbooks.md'
  - Workflows: 'workflows.md'
  - Steps: 'steps.md'
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "TaskBuildersdk"
version = "1.0.0"
description = "sdk for Task Builder module"
readme = "README.md"
authors = [
    { name = "Unifytwin", email = "madhuri.penikalapati@knowledgelens.com" },
    { name = "Unifytwin", email = "tarun.madamanchi@knowledgelens.com" },
]
license = { file = "LICENSE" }
dependencies = [
"PyJWT==2.4.0",
"cryptography==39.0.0",
"httpx",
"jwt-signature-validator==0.0.1",
"kafka-python==2.0.2",
"lttb==0.3.1",
"num2words==0.5.12",
"orjson==3.9.1",
"paho-mqtt==1.5.0",
"pandas==1.5.3",
"pendulum==2.1.2",
"pre-commit==2.20.0",
"psycopg2-binary==2.9.6",
"pycryptodome==3.16.0",
"pydantic~=1.10.4",
"pymongo~=3.13.0",
"pypika",
"python-dotenv~=0.21.0",
"pytz==2023.3",
"redis==3.5.3",
"ujson==5.8.0",
"uvicorn==0.17.5",
"word2number",
"shortuuid==1.0.11",
"sqlalchemy==1.4.46"
]
requires-python = ">=3.9"
[tool.black]
line-length = 120
[tool.isort]
profile = "black"
[tool.ruff]
select = [
"E", # pycodestyle errors
"W", # pycodestyle warnings
"F", # pyflakes
# "I", # isort
"C", # flake8-comprehensions
"B", # flake8-bugbear
]
ignore = [
"E501", # line too long, handled by black
"B008", # do not perform function calls in argument defaults
"C901", # too complex
"E402",
"B904",
"B905",
"B009"
]
[tool.ruff.per-file-ignores]
"__init__.py" = ["F401"]
cryptography==39.0.0
httpx
jwt-signature-validator==0.0.1
kafka-python==2.0.2
lttb==0.3.1
num2words==0.5.12
orjson==3.9.1
paho-mqtt==1.5.0
pandas==1.5.3
pendulum==2.1.2
pre-commit==2.20.0
psycopg2-binary==2.9.6
pycryptodome==3.16.0
pydantic~=1.10.4
PyJWT==2.4.0
pymongo~=3.13.0
pypika
python-dotenv~=0.21.0
pytz==2023.3
redis==3.5.3
shortuuid==1.0.11
sqlalchemy==1.4.46
ujson==5.8.0
uvicorn==0.17.5
word2number
mkdocs==1.5.3
mkdocstrings-python==1.7.1
mkdocs-material==9.4.3
from setuptools import setup, find_packages

# Runtime dependencies are read from the requirements file shipped inside the package.
with open("tb_sdk/requirements.txt") as f:
    install_requires = f.read().strip().split("\n")

setup(
    name="tb-sdk",
    version="1.0",
    packages=find_packages(),
    install_requires=install_requires,
)
import os
import pathlib
import shutil
import sys
from typing import Any, Optional

from dotenv import load_dotenv
# root_validator is the pydantic 1.x equivalent of v2's model_validator;
# this project pins pydantic~=1.10.4.
from pydantic import BaseSettings, Field, root_validator

load_dotenv()

PROJECT_NAME = "Task Builder SDK"


class _Service(BaseSettings):
    MODULE_NAME: str = Field(default="Task Builder SDK")
    LOG_LEVEL: str = "INFO"
    ENABLE_FILE_LOG: Optional[Any] = False
    ENABLE_CONSOLE_LOG: Optional[Any] = True

    @root_validator
    def validate_values(cls, values):
        values["LOG_LEVEL"] = values["LOG_LEVEL"] or "INFO"
        print(f"Logging Level set to: {values['LOG_LEVEL']}")
        return values


class _PathToServices(BaseSettings):
    FORM_MT: Optional[str] = Field(None, env="FORM_MT")
    WORKFLOW_MT: Optional[str] = Field(None, env="WORKFLOW_MT")

    @root_validator
    def validate_values(cls, values):
        if not values.get("FORM_MT"):
            print("Error: environment variable FORM_MT not set")
            sys.exit(1)
        if not values.get("WORKFLOW_MT"):
            print("Error: environment variable WORKFLOW_MT not set")
            sys.exit(1)
        return values


class _DatabaseConstants(BaseSettings):
    TEMPLATES_DB: str = Field(default="templates_db")
    METADATA_DB: str = Field(default="metadata_db")
    ILENS_ASSISTANT_DB: str = Field(default="ilens_assistant")
    ILENS_MES_DB: str = Field(default="ilens_mes_db")
    ILENS_ASSET_MODEL: str = Field(default="ilens_asset_model")


class _Databases(BaseSettings):
    MONGO_URI: Optional[str]
    POSTGRES_URI: Optional[str]
    USE_POSTGRES: Optional[bool]
    MAINTENANCE_URI: Optional[str] = "maintenance_logbook_qa"
    MAINTENANCE_DB_URI: Optional[str]
    KAIROS_URI: Optional[str]
    REDIS_URI: Optional[str]
    REDIS_LOGIN_DB: Optional[int] = 9
    REDIS_USER_ROLE_DB: Optional[int] = 21
    REDIS_HIERARCHY_DB: Optional[int] = 7
    REDIS_PROJECT_DB: Optional[int] = 18
    PG_ASSISTANT_DB: Optional[str] = "ilens_assistant"
    PG_SCHEMA: Optional[str] = "public"
    PG_REMOVE_PREFIX: bool = False

    @root_validator
    def validate_values(cls, values):
        if not values.get("MONGO_URI"):
            print("Error: environment variable MONGO_URI not set")
            sys.exit(1)
        if not values.get("MAINTENANCE_URI"):
            print("Error: environment variable MAINTENANCE_URI not set")
            sys.exit(1)
        if not values.get("KAIROS_URI"):
            print("Environment variable KAIROS_URI not set, continuing without Kafka/Kairos support")
        if not values.get("POSTGRES_URI"):
            print("Error: environment variable POSTGRES_URI not set")
            sys.exit(1)
        values["MAINTENANCE_DB_URI"] = f"{values['POSTGRES_URI']}/{values['MAINTENANCE_URI']}"
        return values


class _KeyPath(BaseSettings):
    KEYS_PATH: Optional[pathlib.Path] = Field("data/keys")
    PUBLIC: Optional[pathlib.Path]
    PRIVATE: Optional[pathlib.Path]

    @root_validator
    def assign_values(cls, values):
        keys_path = values.get("KEYS_PATH")
        # Seed the key directory from the bundled assets if either key is missing.
        if not os.path.isfile(os.path.join(keys_path, "public")) or not os.path.isfile(
            os.path.join(keys_path, "private")
        ):
            if not os.path.exists(keys_path):
                os.makedirs(keys_path)
            shutil.copy(os.path.join("assets", "keys", "public"), os.path.join(keys_path, "public"))
            shutil.copy(os.path.join("assets", "keys", "private"), os.path.join(keys_path, "private"))
        values["PUBLIC"] = os.path.join(keys_path, "public")
        values["PRIVATE"] = os.path.join(keys_path, "private")
        return values


Service = _Service()
PathToServices = _PathToServices()
DatabaseConstants = _DatabaseConstants()
DBConf = _Databases()
KeyPath = _KeyPath()

__all__ = [
    "PROJECT_NAME",
    "Service",
    "PathToServices",
    "DatabaseConstants",
    "DBConf",
    "KeyPath",
]
class APIEndPoints:
# Form Management endpoint
render_proxy = "/render"
task_proxy = "/task"
logbook_proxy = "/logbook"
workflow_proxy = "/workflow"
step_proxy = "/step"
api_form = "/form"
api_fetch = "/fetch"
api_list = "/list"
api_get = "/get"
api_list_logbook_data = "/list_logbook_data"
api_list_workflow_data = "/list_workflow_data"
api_list_step_data = "/fetch_step_data"
api_fetch_step_data = "/fetch"
class STATUS:
SUCCESS = "success"
FAILED = "failed"
SUCCESS_CODES = [200, 201]
class CommonKeys:
KEY_USER_ID = "user_id"
KEY_PROCESS_TEMPLATE = "process_template"
KEY_SITE_TEMPLATE = "site_template"
KEY_PROCESS_TEMPLT_ID = "process_templt_id"
KEY_KEY_LIST = "key_list"
KEY_VALUE = "value"
KEY_SITE_TEMPLT_ID = "site_templt_id"
KEY_TYPE = "type"
KEY_LOOKUP = "lookup_name"
KEY_CREATED_BY = "created_by"
KEY_CREATED_TIME = "created_at"
KEY_COMPLETED_AT = "completed_at"
KEY_UPDATED_AT = "updated_by"
KEY_LAST_UPDATED_TIME = "updated_at"
step_states = "step_states"
event_categories = "event_categories"
KEY_REGEX = "$regex"
KEY_OPTIONS = "$options"
class UserCollectionKeys:
KEY_LANGUAGE = "language"
KEY_NAME = "name"
KEY_USER_ID = "user_id"
KEY_PROJECT_ID = "project_id"
KEY_USERNAME = "username"
KEY_USER_ROLE = "userrole"
class UserKeys:
project_specific_keys = [
"project_id",
"AccessLevel",
"userrole",
"access_group_ids",
"landing_page",
"is_app_user",
"product_access",
"app_url",
"location",
"department",
"section",
]
class CommonConstants:
ui = "ui_datetime_format"
utc = "utc_datetime_format"
nsc = "no_special_chars_datetime_format"
umtf = "user_meta_time_format"
__temporary_format__ = "%Y-%m-%dT%H:%M:%S+0530"
__iso_format__ = "%Y-%m-%dT%H:%M:%S%z"
__form_iso_format__ = "%Y-%m-%dT%H:%M:%S.%f%z"
__utc_datetime_format__ = "%Y-%m-%dT%H:%M:%S"
__ui_datetime_format__ = "%Y-%m-%d %H:%M:%S"
__no_special_chars_datetime_format__ = "%Y%m%d%H%M%S"
__user_meta_time_format__ = "%d-%m-%Y %H:%M:%S"
__user_meta_time_format_ws__ = "%d-%m-%Y %H:%M"
__ui_display_datetime_format__ = "%d %b %Y %H:%M"
__ui_display_date_format__ = "%d %b %Y"
__ui_display_date_format_for_task__ = "%d %B %Y"
__time_format__ = "%H:%M:%S"
time_zone = "Asia/Kolkata"
directory_path = "templates/"
UPLOAD_CSV_FILE_PATH = "conf"
task_file_name = "Task Schedule form.xlsx"
task_category = "Task Category"
meta_created_at = "meta.created_at"
login_module = "Login module"
user_names = "User Names"
task_schedule_time = "task schedule time"
end_on = "end on"
task_creation_freq_units = "task creation freq units"
scheduling_required = "scheduling required"
task_creation_frequency = "task creation frequency"
fafa_exclamation_triangle = "fa fa-exclamation-triangle text-warning fs-1"
not_authorized = "Not authorized"
not_permitted = "Not Permitted"
reference_id = "Reference Id"
task_description = "Task Description"
logbook_name = "Logbook Name"
assign_to = "Assign To"
schedule_on = "Scheduled On"
open_tickets = "Open tickets"
from tb_sdk.config import DatabaseConstants
class DatabaseNames:
ilens_configuration = DatabaseConstants.METADATA_DB
ilens_assistant = DatabaseConstants.ILENS_ASSISTANT_DB
ilens_mes = DatabaseConstants.ILENS_MES_DB
ilens_asset_model = DatabaseConstants.ILENS_ASSET_MODEL
class CollectionNames:
static_content = "static_content"
sms_gateway = "sms_gateway"
form_props = "form_props"
user_project = "user_project"
scheduled_info = "scheduled_info"
forms = "forms"
asset_overview = "assets"
unique_id = "unique_id"
user = "user"
access_group = "access_group"
materials = "materials"
lookup_table = "lookup_table"
template_category = "template_category"
step_category = "step_category"
constants = "constants"
workflows = "workflows"
workflow_permissions = "workflow_permissions"
triggers = "triggers"
product_master = "product_master"
periodic_data = "periodic_data"
project_remarks = "project_remarks"
action_templates = "action_templates"
user_role = "user_role"
shift_details = "shift_details"
site_conf = "site_conf"
customer_projects = "customer_projects"
logbook = "logbook"
logbook_links = "logbook_links"
job_list = "job_list"
schedule_metadata = "schedule_metadata"
tasks = "task_info"
task_instance = "task_instances"
task_instance_data = "task_instance_data"
steps = "steps"
static = "static"
trigger_steps = "trigger_steps"
trigger_templates = "trigger_templates"
shifts = "shifts"
device_model = "device_model"
asset_model_details = "asset_model_details"
rule_targets = "rule_targets"
process_conf = "process_conf"
hierarchy_details = "hierarchy_details"
class CustomerProjectKeys:
KEY_CUSTOMER_PROJECT_ID = "customer_project_id"
KEY_CUSTOMER_PROJECT_NAME = "customer_project_name"
KEY_SITE_TEMPLT_ID = "site_templt_id"
KEY_PROCESS_TEMPLT_ID = "process_templt_id"
class UserRoleCollectionKeys:
KEY_USER_ROLE_ID = "user_role_id"
KEY_PROJECT_ID = "project_id"
class CommonStatusCode:
SUCCESS_CODES = (
200,
201,
204,
)
class Secrets:
LOCK_OUT_TIME_MINS = 30
leeway_in_mins = 10
unique_key = "45c37939-0f75"
token = "8674cd1d-2578-4a62-8ab7-d3ee5f9a"
issuer = "ilens"
alg = "RS256"
signature_key = "kliLensKLiLensKL"
signature_key_alg = ["HS256"]
class Captcha:
cookie_encryption_private_key = "#ilenskey@rock1#"
captcha_cookie_encryption_key = "UnifyTwin@r@ck1$"
from sqlalchemy import JSON, Column, Float, Integer, Text
from tb_sdk.connectors.db.psql.databases import Base
class TicketEntryTb(Base):
__tablename__ = "ticket_entry"
workflow_id = Column(Text)
template_id = Column(Text)
ticket_title = Column(Text)
site_hierarchy = Column(Text)
data = Column(JSON)
user_id = Column(Text)
created_on = Column(Float(precision=20, decimal_return_scale=True))
last_updated = Column(Float(precision=20, decimal_return_scale=True))
expiry_date = Column(Float(precision=20, decimal_return_scale=True))
assign_to = Column(Text)
id = Column(Integer, primary_key=True, autoincrement=True)
event_type = Column(Text)
event_status = Column(Text)
project_id = Column(Text)
from tb_sdk.config import DBConf
from tb_sdk.connectors.utils.mongo_util import MongoConnect
mongo_client = MongoConnect(uri=DBConf.MONGO_URI)()
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.constants import Constants
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.customer_projects import (
CustomerProjects,
)
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user import User
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user_project import UserProject
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user_role import UserRole
from typing import Any, Optional
from tb_sdk.connectors.constants import CommonKeys
from tb_sdk.connectors.constants.app_constants import CollectionNames, DatabaseNames
from tb_sdk.connectors.db.mongo.schema import MongoBaseSchema
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class ConstantsSchema(MongoBaseSchema):
"""
This is the Schema for the Mongo DB Collection.
All datastore and general responses will be following the schema.
"""
type: str
data: Any
tableData: Optional[Any]
class Constants(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.constants)
@property
def key_type(self):
return CommonKeys.KEY_TYPE
def find_constant_by_dict(self, _type):
"""
The following function will give one record for a given set of
search parameters as keyword arguments
:param _type:
:return:
"""
record = self.find_one(query={self.key_type: _type})
return dict(record) if record else {}
def find_constant(self, _type, filter_dict=None):
"""
The following function will give one record for a given set of
search parameters as keyword arguments
:param _type:
:param filter_dict:
:return:
"""
query = {self.key_type: _type}
record = self.find_one(query=query, filter_dict=filter_dict)
return ConstantsSchema(**record) if record else None
def find_constant_dict(self, _type, filter_dict=None):
"""
The following function will give one record for a given set of
search parameters as keyword arguments
:param _type:
:param filter_dict:
:return:
"""
query = {self.key_type: _type}
record = self.find_one(query=query, filter_dict=filter_dict)
return dict(record) if record else {}
def insert_one_constant(self, data):
"""
The following function will insert one tag in the
tags collections
:param self:
:param data:
:return:
"""
return self.insert_one(data)
def find_constant_by_content(self, content_type):
"""
The following function will give one record for a given set of
search parameters as keyword arguments
"""
query = {"content_type": content_type}
search_option = {"data": 1}
record = self.find_one(query=query, filter_dict=search_option)
return record or {}
    def get_aggregate_find_constants_dict(self, query_json: list, filter_dict=None):
        if filter_dict:
            query_json.append({"$project": filter_dict})
        record = list(self.aggregate(query_json))
        # Guard against an empty result before indexing.
        return record[0] if record else {}
from typing import Dict, Optional
from tb_sdk.connectors.constants.app_constants import (
CollectionNames,
CustomerProjectKeys,
DatabaseNames,
)
from tb_sdk.connectors.db.mongo.schema import MongoBaseSchema
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class CustomerProjectsSchema(MongoBaseSchema):
"""
This is the Schema for the Mongo DB Collection.
All datastore and general responses will be following the schema.
"""
customer_project_name: Optional[str]
description: Optional[str]
site_templt_id: Optional[str]
logo_name: Optional[str]
logo_url: Optional[str]
process_templt_id: Optional[str]
update_details: Optional[Dict]
user_id: Optional[str]
customer_project_id: Optional[str]
product_encrypted: Optional[bool]
class CustomerProjects(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(
mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.customer_projects
)
@property
def key_customer_project_id(self):
return CustomerProjectKeys.KEY_CUSTOMER_PROJECT_ID
@property
def key_customer_project_name(self):
return CustomerProjectKeys.KEY_CUSTOMER_PROJECT_NAME
def find_project(self, project_id=None, project_name=None, filter_dict=None):
"""
The following function will give one record for a given set of
search parameters as keyword arguments
:param filter_dict:
:param project_id:
:param project_name
:return:
"""
query = {}
if project_id:
query[self.key_customer_project_id] = project_id
if project_name:
query[self.key_customer_project_name] = project_name
record = self.find_one(query=query, filter_dict=filter_dict)
return CustomerProjectsSchema(**record).dict() if record else {}
def find_project_by_query(self, query, filter_dict=None):
if record := self.find(query=query, filter_dict=filter_dict):
return record
return []
def insert_one_project(self, data):
"""
The following function will insert one project in the
customer_projects collections
:param self:
:param data:
:return:
"""
return self.insert_one(data)
def update_one_project(self, project_id, data):
query = {self.key_customer_project_id: project_id}
return self.update_one(query=query, data=data)
def delete_one_project(self, project_id):
if project_id:
query = {self.key_customer_project_id: project_id}
return self.delete_one(query)
else:
return False
def get_project_data_by_aggregate(self, query: list):
return list(self.aggregate(pipelines=query))
from typing import List, Optional
from tb_sdk.connectors.constants.app_constants import CollectionNames, DatabaseNames
from tb_sdk.connectors.db.mongo.schema import MongoBaseSchema
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class ErrorCodes(MongoBaseSchema):
key: Optional[str]
type: Optional[str]
values: Optional[List] = []
class StaticCollection(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.static)
def find_error_codes(self, _type: str):
if error_codes := self.find_one(query={"type": _type}):
return error_codes["values"]
else:
return error_codes
from typing import Any, Dict, List, Optional
from tb_sdk.connectors.constants import UserCollectionKeys
from tb_sdk.connectors.constants.app_constants import CollectionNames, DatabaseNames
from tb_sdk.connectors.constants import UserKeys
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user_project import UserProject
from tb_sdk.connectors.db.mongo.schema import MongoBaseSchema
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class UserSchema(MongoBaseSchema):
name: Optional[str]
project_id: Optional[str]
username: Optional[str]
password: Optional[str]
email: Optional[str]
phonenumber: Optional[Any]
userrole: Optional[List[str]]
user_type: Optional[str]
user_id: Optional[str]
AccessLevel: Optional[Dict]
user_access_select_all: Optional[bool]
access_group_ids: Optional[List[str]]
client_id: Optional[str]
created_by: Optional[str]
hmi: Optional[Dict]
encryption_salt: Optional[Dict]
product_encrypted: Optional[bool]
email_preferences: Optional[Dict]
language: Optional[str]
passwordReset: Optional[Dict]
failed_attempts: Optional[int]
is_user_locked: Optional[bool]
last_failed_attempt: Optional[str]
profileImage_name: Optional[str]
profileImage_url: Optional[str]
date_format: Optional[str]
date_time_format: Optional[str]
time_format: Optional[str]
tz: Optional[str]
app_url: Optional[str]
landing_page: Optional[str]
ilens_encrypted: Optional[bool]
class User(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.user)
self.user_project_mongo = UserProject(mongo_client=mongo_client)
@property
def key_username(self):
return UserCollectionKeys.KEY_USERNAME
@property
def key_user_id(self):
return UserCollectionKeys.KEY_USER_ID
@property
def key_language(self):
return UserCollectionKeys.KEY_LANGUAGE
@property
def key_name(self):
return UserCollectionKeys.KEY_NAME
@property
def key_project_id(self):
return UserCollectionKeys.KEY_PROJECT_ID
@property
def key_userrole(self):
return UserCollectionKeys.KEY_USER_ROLE
def get_all_users(self, filter_dict=None, sort=None, skip=0, limit=None, **query):
if users := self.find(filter_dict=filter_dict, sort=sort, skip=skip, limit=limit, query=query):
return list(users)
return []
def find_user(self, user_id):
if user := self.find_one(query={self.key_user_id: user_id}):
return dict(user)
else:
return {}
def find_user_by_param(self, **query):
return UserSchema(**user) if (user := self.find_one(query)) else user
def find_access_site(self, user_id):
if access_site := self.distinct(query_key="AccessLevel.sites.parent_id", filter_json={"user_id": user_id}):
return list(access_site)
return []
def find_by_query_key(self, query_key, user_id):
if access_site := self.distinct(query_key=query_key, filter_json={self.key_user_id: user_id}):
return access_site
return []
def update_one_user(self, user_id, data):
query = {self.key_user_id: user_id}
return self.update_one(query=query, data=data)
def delete_one_user(self, user_id):
return self.delete_one(query={self.key_user_id: user_id})
def users_by_project_and_site(self, user_id, project_id, site_ids):
query = {"$or": [{self.key_user_id: user_id}, {"AccessLevel.sites.parent_id": {"$in": site_ids}}]}
if project_id is not None:
query["$or"].append({self.key_project_id: project_id})
response = self.find(query)
return response or []
def find_user_time_zone(self, user_id):
search_json = {"_id": 0, "tz": 1, "date_format": 1, "time_format": 1, "date_time_format": 1}
return self.find_one(filter_dict=search_json, query={self.key_user_id: user_id})
def distinct_user(self, query_key, filter_json):
query = {self.key_user_id: filter_json}
return self.distinct(query_key=query_key, filter_json=query)
def users_list_by_aggregate(self, query: list):
return self.aggregate(pipelines=query)
def find_user_filter(self, project_id, user_id=None, username=None, filter_dict=None):
query = {}
if user_id:
query[self.key_user_id] = user_id
if username:
query[self.key_username] = username
user = self.find_one(query=query, filter_dict=filter_dict)
if user:
if project_id != user.get("project_id"):
user_project_record = self.user_project_mongo.fetch_user_project(
user_id=user.get(self.key_user_id), project_id=project_id
)
if user_project_record:
for item in UserKeys.project_specific_keys:
user[item] = user_project_record.get(item, None)
return UserSchema(**user)
return user
def find_user_by_project_id(self, user_id, project_id):
if user := self.find_one(query={self.key_user_id: user_id, self.key_project_id: project_id}):
return dict(user)
else:
return user
def find_user_data_with_roles(self, roles, project_id):
query = {self.key_userrole: {"$in": roles}, self.key_project_id: project_id}
return response if (response := list(self.find(query))) else []
def find_user_role_for_user_id(self, user_id, project_id):
query = {self.key_user_id: user_id, self.key_project_id: project_id}
filter_dict = {"userrole": 1, "_id": 0}
return self.find_one(query=query, filter_dict=filter_dict)
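Methods like `users_by_project_and_site` build their Mongo filter incrementally: a base `$or` list matching either the user directly or any of the given sites, extended with a `project_id` clause only when one is supplied. A small self-contained sketch of the resulting query document (field names follow the collection keys above; no database is touched):

```python
def build_user_site_query(user_id, site_ids, project_id=None):
    # Match the user directly, or any user whose AccessLevel
    # includes one of the given sites.
    query = {
        "$or": [
            {"user_id": user_id},
            {"AccessLevel.sites.parent_id": {"$in": site_ids}},
        ]
    }
    # The project clause is appended only when a project is given.
    if project_id is not None:
        query["$or"].append({"project_id": project_id})
    return query

q = build_user_site_query("user_001", ["site_a"], project_id="proj_1")
print(len(q["$or"]))  # 3
```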
from tb_sdk.connectors.constants import UserCollectionKeys
from tb_sdk.connectors.constants.app_constants import CollectionNames, DatabaseNames
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class UserProject(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(
mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.user_project
)
@property
def key_username(self):
return UserCollectionKeys.KEY_USERNAME
@property
def key_user_id(self):
return UserCollectionKeys.KEY_USER_ID
@property
def key_language(self):
return UserCollectionKeys.KEY_LANGUAGE
@property
def key_name(self):
return UserCollectionKeys.KEY_NAME
@property
def key_project_id(self):
return UserCollectionKeys.KEY_PROJECT_ID
def fetch_user_project(self, user_id, project_id):
query = {self.key_user_id: user_id, self.key_project_id: project_id}
return self.find_one(query=query)
def list_projects(self, user_id):
query = {self.key_user_id: user_id}
return self.distinct(query_key=self.key_project_id, filter_json=query)
def fetch_users(self, project_id):
query = {self.key_project_id: project_id}
return self.find(query=query)
def insert_one_user(self, data):
"""
Insert one user record into the user_project collection.
:param data: the user record to insert
:return: the result of the insert operation
"""
return self.insert_one(data)
def update_one_user_project(self, data, user_id, project_id):
query = {self.key_user_id: user_id, self.key_project_id: project_id}
return self.update_one(query=query, data=data)
def delete_user(self, user_id):
if user_id:
return self.delete_many(query={self.key_user_id: user_id})
else:
return False
def delete_user_project(self, user_id, project_id):
if user_id:
return self.delete_one(query={self.key_user_id: user_id, self.key_project_id: project_id})
else:
return False
def find_user_role_for_user_id(self, user_id, project_id):
query = {self.key_user_id: user_id, self.key_project_id: project_id}
filter_dict = {"userrole": 1, "_id": 0}
return self.find_one(query=query, filter_dict=filter_dict)
def users_list_by_aggregate(self, query: list):
return self.aggregate(pipelines=query)
from typing import Dict, Optional
from tb_sdk.connectors.constants.app_constants import (
CollectionNames,
DatabaseNames,
UserRoleCollectionKeys,
)
from tb_sdk.connectors.db.mongo.schema import MongoBaseSchema
from tb_sdk.connectors.utils.mongo_util import MongoCollectionBaseClass
class UserRoleSchema(MongoBaseSchema):
type: Optional[str]
user_role_name: Optional[str]
user_role_description: Optional[str]
user_role_id: Optional[str]
project_id: Optional[str]
user_role_permissions: Optional[Dict]
access_levels: Optional[Dict]
default: Optional[bool]
client_id: Optional[str]
product_encrypted: Optional[bool]
permission_status: Optional[bool]
class UserRole(MongoCollectionBaseClass):
def __init__(self, mongo_client):
super().__init__(mongo_client, database=DatabaseNames.ilens_configuration, collection=CollectionNames.user_role)
@property
def key_user_role_id(self):
return UserRoleCollectionKeys.KEY_USER_ROLE_ID
@property
def key_project_id(self):
return UserRoleCollectionKeys.KEY_PROJECT_ID
def find_roles(self):
return access_groups if (access_groups := self.find({})) else []
def find_role_by_project(self, project_id):
query = {self.key_project_id: project_id}
return access_groups if (access_groups := self.find_one(query)) else []
def find_role_by_param(self, **query):
return access_groups if (access_groups := self.find_one(query)) else {}
def find_roles_by_list(self, user_role_id_list, project_id=None):
query = {"$or": [{self.key_user_role_id: {"$in": user_role_id_list}}]}
if project_id is not None:
query["$or"].append({self.key_project_id: project_id})
return access_groups if (access_groups := self.find(query)) else []
def find_user_role_by_id(self, user_role_id, filter_dict=None):
return self.find_one(query={self.key_user_role_id: user_role_id}, filter_dict=filter_dict)
def update_user_role(self, _id, data):
query = {self.key_user_role_id: _id}
return self.update_one(query=query, data=data)
def delete_user_role(self, _id):
if _id:
query = {self.key_user_role_id: _id}
return self.delete_one(query=query)
else:
return False
def get_data_by_aggregate(self, query_json: list):
return list(self.aggregate(query_json))
from pydantic import BaseModel
class MongoBaseSchema(BaseModel):
pass
class BaseRequestSchema(BaseModel):
"""
This is the base schema for input requests to the Collection class.
"""
from sqlalchemy import Column, MetaData, Table, inspect
from sqlalchemy.schema import DDL
from tb_sdk.config import DBConf
class TableDDL:
def __init__(self, session, table_name):
self.engine = session.get_bind().engine
self.table_name = table_name
def create_new_table(self, data_types: dict):
metadata = MetaData()
table = Table(self.table_name, metadata, schema=DBConf.PG_SCHEMA)
if not inspect(self.engine).has_table(self.table_name, schema=DBConf.PG_SCHEMA):
for column_name, d_type in data_types.items():
table.append_column(Column(column_name, d_type[0], primary_key=d_type[1]))
metadata.create_all(self.engine)
return
self.alter_add_column(data_types)
def alter_add_column(self, data_types):
try:
metadata = MetaData(bind=self.engine)
existing_table = Table(self.table_name, metadata, autoload=True, schema=DBConf.PG_SCHEMA)
existing_columns = existing_table.columns.keys()
missing_columns = set(data_types.keys()) - set(existing_columns)
if not missing_columns:
return
for col in missing_columns:
column = Column(col, data_types[col][0], primary_key=data_types[col][1])
alter_table_stmt = DDL(
f"ALTER TABLE {self.table_name} ADD COLUMN {column.compile(metadata.bind)} "
f"{column.type.compile(self.engine.dialect)}"
)
with self.engine.connect() as connection:
connection.execute(alter_table_stmt)
except Exception as e:
print(e)
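`alter_add_column` above decides what to add by diffing the requested column map against the live table's columns. The core computation is a set difference, sketched here without SQLAlchemy (the sample column map mimics the `{name: (type, is_primary_key)}` shape the class expects):

```python
def missing_columns(desired: dict, existing: list) -> set:
    # Columns requested by the caller but absent from the live table.
    return set(desired) - set(existing)

desired = {"id": ("INTEGER", True), "name": ("TEXT", False), "ts": ("TIMESTAMP", False)}
existing = ["id", "name"]
print(missing_columns(desired, existing))  # {'ts'}
```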
import os
from fastapi import Request
from sqlalchemy import create_engine
from sqlalchemy.engine.reflection import Inspector
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlalchemy.schema import CreateSchema
from sqlalchemy_utils import create_database, database_exists
from tb_sdk.config import DBConf
from tb_sdk.connectors.db.redis_connections import project_details_db
from tb_sdk.connectors.utils.redis_db_name_util import get_db_name
Base = declarative_base()
def get_assistant_db(request_data: Request):
if not DBConf.USE_POSTGRES:
return
project_id = request_data.cookies.get("projectId", request_data.cookies.get("project_id"))
db = (
get_db_name(redis_client=project_details_db, project_id=project_id, database=DBConf.PG_ASSISTANT_DB)
if not DBConf.PG_REMOVE_PREFIX
else DBConf.PG_ASSISTANT_DB
)
uri = f"{DBConf.POSTGRES_URI}/{db}"
engine = create_engine(
f"{uri}",
pool_size=int(os.getenv("PG_POOL_SIZE", "5")),
max_overflow=int(os.getenv("PG_MAX_OVERFLOW", "10")),
)
if not database_exists(engine.url):
create_database(engine.url)
inspector = Inspector.from_engine(engine)
if DBConf.PG_SCHEMA not in inspector.get_schema_names():
engine.execute(CreateSchema(DBConf.PG_SCHEMA, quote=True))
session_local = sessionmaker(autocommit=False, autoflush=False, bind=engine)
db = session_local()
try:
yield db
finally:
db.close()
import redis
from tb_sdk.config import DBConf
login_db = redis.from_url(DBConf.REDIS_URI, db=int(DBConf.REDIS_LOGIN_DB), decode_responses=True)
project_details_db = redis.from_url(DBConf.REDIS_URI, db=int(DBConf.REDIS_PROJECT_DB), decode_responses=True)
user_role_permissions_redis = redis.from_url(DBConf.REDIS_URI, db=int(DBConf.REDIS_USER_ROLE_DB), decode_responses=True)
hierarchy_mapping_db = redis.from_url(DBConf.REDIS_URI, db=int(DBConf.REDIS_HIERARCHY_DB), decode_responses=True)
class ILensErrors(Exception):
"""Generic iLens Error"""
class KairosDBError(ILensErrors):
"""Kairos DB Error"""
class AuthenticationError(ILensErrors):
pass
class DataFrameFormationError(ILensErrors):
"""Raise when there is an error during dataframe formation"""
class IllegalTimeSelectionError(ILensErrors):
pass
class TemplateFormationError(ILensErrors):
pass
class ProductsNotFoundError(ILensErrors):
"""Raise when products matching conditions are not found"""
class DataNotFound(ILensErrors):
"""
Raise when data is not found
"""
class TagDetailsNotFound(Exception):
"""
Raise when tag details are crucial to proceed and meta service returns empty list
"""
class TimeColumnError(ILensErrors):
"""Raised when a time-related column is empty"""
class DuplicateName(ILensErrors):
pass
class InputRequestError(Exception):
pass
class JWTDecodingError(Exception):
pass
class DashboardNotFound(ILensErrors):
pass
class WidgetsNotFound(ILensErrors):
pass
class JobCreationError(Exception):
"""
Raised when a Job Creation throws an exception.
Job Creation happens by adding a record to Mongo.
"""
class TemplateNotFoundError(Exception):
"""
This error is raised when Dashboard/Widget Template is not found
"""
class ChartFormationError(Exception):
pass
class QueryFormationError(Exception):
pass
class UnauthorizedError(Exception):
pass
class RequiredFieldMissing(Exception):
pass
class PostgresDBError(Exception):
pass
class CustomError(Exception):
pass
class ErrorMessages:
UNKNOWN_ERROR = "Error occurred while processing your request."
ERROR001 = "Authentication Failed. Please verify token"
ERROR002 = "Signature Expired"
ERROR003 = "Signature Not Valid"
ERROR004 = "No Data available to form dataframe"
ERROR005 = "Error occurred while forming chart"
ERROR006 = "Data Not Found"
ERROR007 = "Error while reading initial parameters. Ensure Type is set with valid arguments."
ERROR008 = "Failed to add Annotation"
ERROR009 = "Unknown Error while fetching chart data"
ERROR010 = "File not Found"
ERROR011 = "Chart Not Implemented"
ERROR012 = "SQL Query Formation Error"
K_ERROR1 = "Data Not Found in Time series Database"
K_ERROR2 = "Time series Database returned with an error"
K_ERROR3 = "Communication Error with Time series Database"
DF_ERROR1 = "Error occurred while forming Dataframe"
DF_ERROR2 = "Given group-by parameters are invalid"
META_ERROR1 = "Tags not Found in Meta"
SPC_ERRO2 = "Not Enough Data"
REQUEST_ERROR = "Request Error"
SPC_ERR01 = "Data not found in SPC tag"
CHART_ERROR = "Error occurred while forming chart"
COMMON_MESSAGE = "Exception while updating: "
CONNECTION_EXCEPTION = "Exception while closing connection: "
DASHLIST_ERROR = "Error occurred in server while listing dashboards"
required_fields_error = "Required fields have not been filled by user"
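Because most of the errors above subclass `ILensErrors`, callers can catch the whole family with one handler and still branch on the specific subclass. A self-contained sketch with a trimmed-down copy of the hierarchy (`fetch_series` is a hypothetical caller, not part of the SDK):

```python
class ILensErrors(Exception):
    """Generic iLens error (trimmed copy for illustration)."""

class KairosDBError(ILensErrors):
    """Time-series database error."""

def fetch_series(ok: bool):
    # Hypothetical data fetch that fails against the time-series DB.
    if not ok:
        raise KairosDBError("Time series Database returned with an error")
    return [1, 2, 3]

try:
    fetch_series(ok=False)
except ILensErrors as err:  # one handler covers every subclass
    caught = type(err).__name__
print(caught)  # KairosDBError
```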
from typing import Any, Dict, List, Optional
from pydantic import BaseModel
class FetchLogbookInfo(BaseModel):
logbook_id: str
project_id: Optional[str]
class LogbookListRequest(BaseModel):
tableParams: Optional[Dict]
page: Optional[int]
records: Optional[int]
filters: Optional[Dict]
project_id: Optional[str]
is_deleted: bool = False
timezone: Optional[str] = "Asia/Kolkata"
template_category: Optional[str] = ""
from typing import Dict, List, Optional
from pydantic import BaseModel, Field
class ExternRequest(BaseModel):
url: str
timeout: int
cookies: Optional[Dict]
params: Optional[Dict]
auth: Optional[tuple]
headers: Optional[Dict]
class MetaInfoSchema(BaseModel):
projectId: Optional[str] = ""
project_id: Optional[str] = ""
user_id: Optional[str] = ""
language: Optional[str] = ""
ip_address: Optional[str] = ""
login_token: Optional[str] = Field(alias="login-token")
class Config:
allow_population_by_field_name = True
from typing import Dict, List, Optional
from pydantic import BaseModel, Field
from tb_sdk.connectors.constants import CommonConstants
class FetchStepSchema(BaseModel):
project_id: Optional[str]
step_id: Optional[str]
step_version: Optional[int]
task_id: Optional[str]
editMode: Optional[bool] = False
class ListStepRequest(BaseModel):
tableParams: Optional[Dict]
timezone: Optional[str] = CommonConstants.time_zone
page: Optional[int]
records: Optional[int]
filters: Optional[Dict]
is_workflow: Optional[bool] = False
project_id: Optional[str]
shift_details: Optional[list] = []
shift_enabled: Optional[bool] = False
from typing import Any, Dict, List, Optional
from pydantic import BaseModel
from tb_sdk.connectors.constants import CommonConstants
class SaveForm(BaseModel):
project_id: Optional[str]
user_id: Optional[str]
user_details: Optional[Dict] = {}
stage_id: Optional[str]
step_id: Optional[str]
submitted_data: Optional[Dict] = {}
components: Optional[List]
stages: Optional[List]
type: Optional[str]
signature_id: Optional[str]
current_status: Optional[str]
date: Optional[int]
task_id: Optional[str]
tz: Optional[str] = CommonConstants.time_zone
template_type: Optional[List] = ["cross_step", "JMR"]
auto_populate_key: Optional[str] = "auto_populate"
triggers: Optional[Dict] = {}
allow_all_manual: Optional[bool] = False
logbook_data: Optional[Dict] = {}
signed_fields: Optional[List] = []
cookies: Optional[Dict]
site_templates: Optional[List]
custom_field: Optional[Dict]
advanced_configuration: Optional[Dict] = {}
hierarchy: Optional[Dict] = {}
class FormRender(BaseModel):
project_id: str
task_id: str
triggers: Optional[dict] = {}
tz: Optional[str] = CommonConstants.time_zone
stage_id: str
project_type: Optional[str]
language: Optional[str]
class TaskDetails(BaseModel):
task_info_id: Optional[str]
task_id: Optional[str]
project_id: str
timezone: Optional[str] = CommonConstants.time_zone
notification_types: Optional[list]
users_and_users_group: Optional[list]
fetch_by_task_data: bool = False
task_details: Optional[Dict] = {}
class TaskListSchema(BaseModel):
project_id: Optional[str]
tableParams: Optional[Dict]
timezone: Optional[str] = CommonConstants.time_zone
page: Optional[int]
records: Optional[int]
filters: Optional[Dict] = {}
dashboard: Optional[bool] = False
task: Optional[bool] = True
endOfRecords: Optional[bool] = False
future_tasks: Optional[bool] = False
mobile: bool = False
shift_wise: Optional[bool] = False
shift_id: Optional[str]
page_type: Optional[str]
filter_count: Optional[int] = 0
user_id: Optional[str]
from typing import Dict, List, Optional
from pydantic import BaseModel, Field
class FetchWorkFlowResponse(BaseModel):
project_id: Optional[str]
workflow_id: Optional[str]
workflow_version: Optional[int]
type: Optional[str]
is_deleted: Optional[bool] = False
shift_details: Optional[List] = []
shift_enabled: Optional[bool] = False
class ListWorkflowRequest(BaseModel):
tableParams: Optional[Dict]
page: Optional[int]
records: Optional[int]
filters: Optional[Dict]
project_id: Optional[str]
timezone: Optional[str] = "Asia/Kolkata"
is_deleted: bool = False
from tb_sdk.config import PathToServices
from tb_sdk.connectors.constants import APIEndPoints
from tb_sdk.connectors.schemas.logbook_schema import FetchLogbookInfo, LogbookListRequest
from tb_sdk.connectors.utils.common_utils import CommonUtils
class Logbooks:
def __init__(self):
self.workflow_task_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.logbook_proxy}"
def fetch_logbook_details(self, request_data: FetchLogbookInfo, user_id: str, project_id: str):
"""
Info:
The fetch_logbook_details function is used to fetch the logbook details for a given workflow task.
Usage:
logbooks_obj = Logbooks()
logbooks_obj.fetch_logbook_details(request_data, user_id, project_id)
Args:
request_data (FetchLogbookInfo(Pydantic Model)): The FetchLogbookInfo object containing the logbook_id and project_id.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the logbook belongs to.
Returns:
response (Object): Response containing logbook details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_get}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
def list_logbooks(self, request_data: LogbookListRequest, user_id: str, project_id: str):
"""
Info:
The list_logbooks function is used to list all the logbooks for a given project.
Usage:
logbooks_obj = Logbooks()
logbooks_obj.list_logbooks(request_data, user_id, project_id)
Args:
request_data (LogbookListRequest(Pydantic Model)): The request data containing the parameters required to list logbooks.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the logbooks belong to.
Returns:
response (Object): Response containing list of logbook details.
"""
try:
common_utils = CommonUtils()
api_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.api_list_logbook_data}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
from tb_sdk.config import PathToServices
from tb_sdk.connectors.constants import APIEndPoints
from tb_sdk.connectors.schemas.step_schema import ListStepRequest, FetchStepSchema
from tb_sdk.connectors.utils.common_utils import CommonUtils
class Steps:
def __init__(self):
self.workflow_task_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.step_proxy}"
def fetch_step_details(self, request_data: FetchStepSchema, user_id: str, project_id: str):
"""
Info:
The fetch_step_details function is used to fetch the details of a step.
Usage:
step_obj = Steps()
step_obj.fetch_step_details(request_data, user_id, project_id)
Args:
request_data (FetchStepSchema(Pydantic Model)): The request data containing the parameters required to fetch step details.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the step belongs to.
Returns:
response (Object): Response containing step details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_fetch_step_data}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
def list_steps(self, request_data: ListStepRequest, user_id: str, project_id: str):
"""
Info:
The list_steps function is used to list all the steps in a workflow.
Usage:
step_obj = Steps()
step_obj.list_steps(request_data, user_id, project_id)
Args:
request_data (ListStepRequest(Pydantic Model)): The request object containing the parameters required to list steps.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the steps belong to.
Returns:
response (Object): Response containing list of step details.
"""
try:
common_utils = CommonUtils()
api_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.api_list_step_data}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
from tb_sdk.config import PathToServices
from tb_sdk.connectors.constants import APIEndPoints
from tb_sdk.connectors.errors import InputRequestError, RequiredFieldMissing, ErrorMessages
from tb_sdk.connectors.schemas.tasks_schema import SaveForm, FormRender, TaskDetails, TaskListSchema
from tb_sdk.connectors.utils.common_utils import CommonUtils
class Tasks:
def __init__(self):
self.form_render_url = f"{PathToServices.FORM_MT}{APIEndPoints.render_proxy}"
self.workflow_task_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.task_proxy}"
def render_form(self, request_data: FormRender, user_id: str, project_id: str):
"""
Info:
The render_form function is used to render the form.
Usage:
tasks_obj=Tasks()
tasks_obj.render_form(request_data, user_id, project_id)
Args:
request_data (FormRender(Pydantic Model)): The details to render form.
user_id (str): The id of the user rendering this form.
project_id (str): The id of the project that this form belongs to.
Returns:
response (Object): Response containing step metadata and submitted_data
"""
try:
common_utils = CommonUtils()
api_url = f"{self.form_render_url}{APIEndPoints.api_form}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
def save_form(self, request_data: SaveForm, user_id: str, project_id: str):
"""
Info:
The save_form function is used to save the form data.
Usage:
tasks_obj=Tasks()
tasks_obj.save_form(request_data, user_id, project_id)
Args:
request_data (SaveForm(Pydantic Model)): The SaveForm object containing the form data.
user_id (str): The id of the user who is saving this form.
project_id (str): The id of the project that this form belongs to.
Returns:
response (Object): Returns success response as "Form Saved Successfully"
"""
try:
common_utils = CommonUtils()
api_url = f"{self.form_render_url}{APIEndPoints.api_form}?save=True"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
def fetch_task_details(self, request_data: TaskDetails, user_id: str, project_id: str):
"""
Info:
The fetch_task_details function is used to fetch the details of a task.
Usage:
tasks_obj=Tasks()
tasks_obj.fetch_task_details(request_data, user_id, project_id)
Args:
request_data (TaskDetails(Pydantic Model)): The TaskDetails object containing the task_id and project_id.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the task belongs to.
Returns:
response (Object): Returns task meta details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_fetch}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post",
request_cookies=cookies)
except Exception as e:
print(e)
raise
def list_all_tasks(self, request_data: TaskListSchema, user_id: str, project_id: str):
"""
Info:
The list_all_tasks function is used to list all the tasks in a project.
Usage:
tasks_obj=Tasks()
tasks_obj.list_all_tasks(request_data, user_id, project_id)
Args:
request_data (TaskListSchema(Pydantic Model)): The TaskListSchema schema containing details to list all tasks.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the tasks belong to.
Returns:
response (Object): Returns all tasks details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_list}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post",
request_cookies=cookies)
except Exception as e:
print(e)
raise
from tb_sdk.config import PathToServices
from tb_sdk.connectors.constants import APIEndPoints
from tb_sdk.connectors.schemas.workflow_schema import FetchWorkFlowResponse, ListWorkflowRequest
from tb_sdk.connectors.utils.common_utils import CommonUtils
class Workflows:
def __init__(self):
self.form_render_url = f"{PathToServices.FORM_MT}{APIEndPoints.render_proxy}"
self.workflow_task_url = f"{PathToServices.WORKFLOW_MT}{APIEndPoints.workflow_proxy}"
def fetch_workflow_details(self, request_data: FetchWorkFlowResponse, user_id: str, project_id: str):
"""
Info:
The fetch_workflow_details function is used to fetch the workflow details from the Workflow Task API.
Usage:
workflow_obj = Workflows()
workflow_obj.fetch_workflow_details(request_data, user_id, project_id)
Args:
request_data (FetchWorkFlowResponse(Pydantic Model)): The request data containing the parameters required to fetch workflow details.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the workflow belongs to.
Returns:
response (Object): Response containing workflow details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_get}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
def list_workflows(self, request_data: ListWorkflowRequest, user_id: str, project_id: str):
"""
Info:
The list_workflows function is used to list all the workflows for a given project.
Usage:
workflow_obj = Workflows()
workflow_obj.list_workflows(request_data, user_id, project_id)
Args:
request_data (ListWorkflowRequest(Pydantic Model)): The request object containing the parameters required to list workflows.
user_id (str): The id of the user making the request.
project_id (str): The id of the project the workflows belong to.
Returns:
response (Object): Response containing list of workflow details.
"""
try:
common_utils = CommonUtils()
api_url = f"{self.workflow_task_url}{APIEndPoints.api_list_workflow_data}"
cookies = common_utils.generate_cookie_from_user_project(user_id=user_id, project_id=project_id)
return common_utils.hit_external_service(api_url=api_url, payload=request_data.dict(), method="post", request_cookies=cookies)
except Exception as e:
print(e)
raise
import base64
import json
import math
import os
import re
import time
import traceback
from datetime import datetime, timedelta, timezone
from functools import lru_cache, wraps
from typing import List, Optional
import httpx
import pandas as pd
import pendulum
import pytz
from dateutil import parser
from tb_sdk.connectors.constants import CommonConstants, CommonKeys
from tb_sdk.connectors.constants.app_constants import CommonStatusCode
from tb_sdk.connectors.constants.secrets import Secrets
from tb_sdk.connectors.db.mongo import mongo_client
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user import User
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user_project import UserProject
from tb_sdk.connectors.schemas.meta import ExternRequest, MetaInfoSchema
from tb_sdk.connectors.utils.security_utils.apply_encryption_util import create_token
from tb_sdk.connectors.utils.security_utils.jwt_util import JWT
def timed_lru_cache(seconds: int = 10, maxsize: int = 128):
def wrapper_cache(func):
func = lru_cache(maxsize=maxsize)(func)
func.lifetime = timedelta(seconds=seconds)
func.expiration = datetime.now(timezone.utc) + func.lifetime
@wraps(func)
def wrapped_func(*args, **kwargs):
if datetime.now(timezone.utc) >= func.expiration:
print("Cache Expired")
func.cache_clear()
func.expiration = datetime.now(timezone.utc) + func.lifetime
return func(*args, **kwargs)
return wrapped_func
return wrapper_cache
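The decorator above wraps `functools.lru_cache` with a wall-clock expiry: results are served from cache until `seconds` elapse, after which the whole cache is cleared and the lifetime restarts. A quick self-contained check of the caching behaviour, reproducing the decorator so it runs on its own (the `expensive` function and call counter are illustrative):

```python
from datetime import datetime, timedelta, timezone
from functools import lru_cache, wraps

def timed_lru_cache(seconds: int = 10, maxsize: int = 128):
    def wrapper_cache(func):
        func = lru_cache(maxsize=maxsize)(func)
        func.lifetime = timedelta(seconds=seconds)
        func.expiration = datetime.now(timezone.utc) + func.lifetime

        @wraps(func)
        def wrapped_func(*args, **kwargs):
            # Once the lifetime has elapsed, drop everything and restart.
            if datetime.now(timezone.utc) >= func.expiration:
                func.cache_clear()
                func.expiration = datetime.now(timezone.utc) + func.lifetime
            return func(*args, **kwargs)
        return wrapped_func
    return wrapper_cache

calls = []

@timed_lru_cache(seconds=60)
def expensive(x):
    calls.append(x)  # record every real (uncached) invocation
    return x * 2

expensive(2)
expensive(2)  # served from cache: no new call recorded
print(len(calls))  # 1
```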
class CommonUtils(CommonKeys, CommonConstants):
def __init__(self):
self.user_conn = User(mongo_client)
self.user_proj = UserProject(mongo_client)
    def get_user_roles_by_project_id(self, user_id, project_id):
        user_rec = self.user_conn.find_user_by_project_id(user_id=user_id, project_id=project_id) or {}
        if not user_rec:
            user_rec = self.user_proj.fetch_user_project(user_id=user_id, project_id=project_id) or {}
        return user_rec.get("userrole", [])
@staticmethod
def create_token(host: str = "127.0.0.1", user_id=None, internal_token=Secrets.token, project_id=None):
"""
This method is to create a cookie
"""
try:
if user_id is None:
user_id = "user_099"
return create_token(user_id=user_id, ip=host, token=internal_token, project_id=project_id)
except Exception as e:
print(e)
raise
@staticmethod
def hit_external_service(
api_url,
payload=None,
request_cookies=None,
timeout=int(os.environ.get("REQUEST_TIMEOUT", default=30)),
method="post",
params=None,
auth=None,
headers=None,
):
try:
print("Inside function to hit external services" f"\nURL - {api_url}")
payload_json = ExternRequest(
url=api_url, timeout=timeout, cookies=request_cookies, params=params, auth=auth, headers=headers
)
payload_json = payload_json.dict(exclude_none=True)
if payload:
payload_json.update(json=payload)
            with httpx.Client() as client:
                method_type = getattr(client, method)
                for _ in range(3):
                    resp = method_type(**payload_json)
                    print(f"Resp Code:{resp.status_code}")
                    if resp.status_code in CommonStatusCode.SUCCESS_CODES:
                        return resp.json()
                    elif resp.status_code == 404:
                        print(f"Module not found: {api_url}")
                        raise ModuleNotFoundError
                    elif resp.status_code == 401:
                        print(f"Unauthorized to execute request on {api_url}")
                    print(
                        f"Resp Message:{resp.status_code} \n", f"Cookies: {request_cookies} \n", f"Rest API: {api_url}"
                    )
                    time.sleep(3)
                # All retries exhausted without a success response; fail loudly instead of returning None
                raise ConnectionError(f"External service call failed after 3 attempts: {api_url}")
except Exception as e:
print(e)
raise
@staticmethod
def encode_using_jwt(file_name=None):
jwt = JWT()
payload = {}
if file_name:
payload.update({"file_name": file_name})
        exp = datetime.now(timezone.utc) + timedelta(minutes=30)  # aware UTC timestamp for the "exp" claim
_extras = {"iss": Secrets.issuer, "exp": exp}
_payload = {**payload, **_extras}
new_token = jwt.encode(_payload)
return new_token
@staticmethod
def generate_cookie_from_user_project(user_id, project_id, ip="127.0.0.1"):
"""
The generate_cookie_from_user_project function generates a cookie from the user_id and project_id.
Args:
user_id (int): The id of the user to generate a cookie for.
project_id (int): The id of the project to generate a cookie for.
ip (str, optional): Defaults to &quot;127.0.0.2&quot;. The IP address that will be used in generating this token/cookie.
:param user_id: Create a token for the user
:param project_id: Create a cookie that is specific to the project
:param ip: Store the ip address of the user
:return: A metainfoschema object
"""
generated_new_token = create_token(user_id, ip, Secrets.token, project_id=project_id)
return MetaInfoSchema(
login_token=generated_new_token,
projectId=project_id,
project_id=project_id,
user_id=user_id,
ip_address=ip,
)
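The retry loop in `hit_external_service` follows a simple pattern: return on a success code, raise immediately on 404, otherwise sleep and retry up to three times. A stdlib-only sketch of that pattern with the HTTP client stubbed out (`call_with_retries` and `send` are illustrative names, not part of the SDK):

```python
import time

def call_with_retries(send, attempts=3, delay=0, success_codes=(200, 201)):
    """Retry pattern mirroring hit_external_service: send() returns (status, body)."""
    for _ in range(attempts):
        status, body = send()
        if status in success_codes:
            return body          # success: stop retrying
        if status == 404:
            raise ModuleNotFoundError  # the SDK treats 404 as a missing module
        time.sleep(delay)        # back off before the next attempt
    raise ConnectionError(f"failed after {attempts} attempts")

# A stub "service" that fails once, then succeeds.
responses = iter([(500, None), (200, {"ok": True})])
print(call_with_retries(lambda: next(responses)))  # -> {'ok': True}
```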
import os
from datetime import datetime, timezone
from typing import Dict, List, Optional
from pymongo import MongoClient
from pymongo.cursor import Cursor
from tb_sdk.connectors.db.redis_connections import project_details_db
from tb_sdk.connectors.utils.redis_db_name_util import get_db_name
# os.getenv returns a string, so parse it rather than relying on truthiness
META_SOFT_DEL: bool = os.getenv("META_SOFT_DEL", "true").lower() in ("1", "true", "yes")
class MongoConnect:
def __init__(self, uri):
try:
self.uri = uri
self.client = MongoClient(self.uri, connect=False)
except Exception as e:
print(str(e))
raise
def __call__(self, *args, **kwargs):
return self.client
def __repr__(self):
return f"Mongo Client(uri:{self.uri}, server_info={self.client.server_info()})"
class MongoCollectionBaseClass:
    def __init__(self, mongo_client, database, collection, soft_delete: bool = META_SOFT_DEL):
        self.client = mongo_client
        self.database = database
        self.collection = collection
        self.__database = None
        self.__project_id = None
        self.soft_delete = soft_delete

    def __repr__(self):
        return f"{self.__class__.__name__}(database={self.database}, collection={self.collection})"

    @property
    def project_id(self):
        # Returning self.project_id here would recurse infinitely; return the stored value instead
        return self.__project_id

    @project_id.setter
    def project_id(self, project_id):
        self.__project_id = project_id
        if self.__database is None:
            # Remember the original db name so repeated setter calls don't stack prefixes
            self.__database = self.database
        self.database = get_db_name(redis_client=project_details_db, project_id=project_id, database=self.__database)
def insert_one(self, data: Dict):
"""
The function is used to inserting a document to a collection in a Mongo Database.
:param data: Data to be inserted
:return: Insert ID
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
response = collection.insert_one(data)
print(data)
return response.inserted_id
except Exception as e:
print(str(e))
raise
def insert_many(self, data: List):
"""
The function is used to inserting documents to a collection in a Mongo Database.
:param data: List of Data to be inserted
:return: Insert IDs
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
response = collection.insert_many(data)
print(data)
return response.inserted_ids
except Exception as e:
print(str(e))
raise
def find(
self,
query: Dict,
filter_dict: Optional[Dict] = None,
sort=None,
collation: Optional[bool] = False,
skip: Optional[int] = 0,
limit: Optional[int] = None,
) -> Cursor:
"""
The function is used to query documents from a given collection in a Mongo Database
:param query: Query Dictionary
:param filter_dict: Filter Dictionary
:param sort: List of tuple with key and direction. [(key, -1), ...]
:param collation: can add rules for lettercase and accent marks.
:param skip: Skip Number
:param limit: Limit Number
        :return: Cursor over the matching documents
"""
if sort is None:
sort = []
if filter_dict is None:
filter_dict = {"_id": 0}
database_name = self.database
collection_name = self.collection
try:
db = self.client[database_name]
collection = db[collection_name]
if len(sort) > 0:
cursor = (
collection.find(
query,
filter_dict,
)
.sort(sort)
.skip(skip)
)
else:
cursor = collection.find(
query,
filter_dict,
).skip(skip)
if limit:
cursor = cursor.limit(limit)
if collation:
cursor = cursor.collation({"locale": "en"})
print(f"{query}, {filter_dict}")
return cursor
except Exception as e:
print(str(e))
raise
    def count_documents(self, query: Dict, limit: Optional[int] = 1) -> int:
        """
        Count documents matching a query (capped at limit, so the default of 1 acts as an existence check).

        :param query: Query Dictionary
        :param limit: Maximum count to return
        :return: Number of matching documents
        """
database_name = self.database
collection_name = self.collection
try:
db = self.client[database_name]
collection = db[collection_name]
return collection.count_documents(query, limit=limit)
except Exception as e:
print(str(e))
raise
def find_one(self, query: Dict, filter_dict: Optional[Dict] = None):
try:
database_name = self.database
collection_name = self.collection
if filter_dict is None:
filter_dict = {"_id": 0}
db = self.client[database_name]
collection = db[collection_name]
print(f"{self.collection}, {query}, {filter_dict}")
return collection.find_one(query, filter_dict)
except Exception as e:
print(str(e))
raise
def update_one(self, query: Dict, data: Dict, upsert: bool = False):
"""
:param upsert:
:param query:
:param data:
:return:
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
response = collection.update_one(query, {"$set": data}, upsert=upsert)
print(f"{self.collection}, {query}, {data}")
return response.modified_count
except Exception as e:
print(str(e))
raise
def update_to_set(self, query: Dict, param: str, data: Dict, upsert: bool = False):
"""
:param upsert:
:param query:
:param param:
:param data:
:return:
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
response = collection.update_one(query, {"$addToSet": {param: data}}, upsert=upsert)
print(f"{self.collection}, {query}, {data}")
return response.modified_count
except Exception as e:
print(str(e))
raise
def update_many(self, query: Dict, data: Dict, upsert: bool = False):
"""
:param upsert:
:param query:
:param data:
:return:
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
response = collection.update_many(query, {"$set": data}, upsert=upsert)
print(f"{query}, {data}")
return response.modified_count
except Exception as e:
print(str(e))
raise
def delete_many(self, query: Dict):
"""
:param query:
:return:
"""
        try:
            database_name = self.database
            collection_name = self.collection
            db = self.client[database_name]
            collection = db[collection_name]
            if self.soft_delete:
                # Archive matching documents BEFORE deleting them; running the
                # $match after delete_many would find nothing to archive.
                soft_del_query = [
                    {"$match": query},
                    {"$addFields": {"deleted": {"on": datetime.now(timezone.utc)}}},
                    {
                        "$merge": {
                            "into": {"db": f"deleted__{database_name}", "coll": collection_name},
                        }
                    },
                ]
                collection.aggregate(soft_del_query)
            response = collection.delete_many(query)
            print(query)
            return response.deleted_count
        except Exception as e:
            print(str(e))
            raise
def delete_one(self, query: Dict):
"""
:param query:
:return:
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
            if self.soft_delete:
                # NOTE: the $match stage archives every matching document, even
                # though only the first match is deleted below.
                soft_del_query = [
                    {"$match": query},
                    {"$addFields": {"deleted": {"on": datetime.now(timezone.utc)}}},
                    {
                        "$merge": {
                            "into": {"db": f"deleted__{database_name}", "coll": collection_name},
                        }
                    },
                ]
                collection.aggregate(soft_del_query)
response = collection.delete_one(query)
print(query)
return response.deleted_count
except Exception as e:
print(str(e))
raise
def distinct(self, query_key: str, filter_json: Optional[Dict] = None):
"""
:param query_key:
:param filter_json:
:return:
"""
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
print(f"{query_key}, {filter_json}")
return collection.distinct(query_key, filter_json)
except Exception as e:
print(str(e))
raise
def aggregate(self, pipelines: List, collation=None):
try:
database_name = self.database
collection_name = self.collection
db = self.client[database_name]
collection = db[collection_name]
print(f"{self.collection}, {pipelines}")
if collation:
return collection.aggregate(pipelines, collation=collation)
return collection.aggregate(pipelines)
except Exception as e:
print(str(e))
raise
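The soft-delete path in `delete_many`/`delete_one` relies on a server-side aggregation that stamps matching documents with a deletion time and copies them into a shadow `deleted__<db>` database via `$merge` before removal. A stdlib-only sketch of the pipeline shape (`build_soft_delete_pipeline` is an illustrative name, not part of the SDK):

```python
from datetime import datetime, timezone

def build_soft_delete_pipeline(query, database_name, collection_name):
    """Mirror of the pipeline assembled inside delete_many/delete_one."""
    return [
        {"$match": query},                                                # select the docs being deleted
        {"$addFields": {"deleted": {"on": datetime.now(timezone.utc)}}},  # stamp the deletion time
        {"$merge": {"into": {"db": f"deleted__{database_name}", "coll": collection_name}}},  # archive copy
    ]

pipeline = build_soft_delete_pipeline({"user_id": "user_099"}, "ilens_configuration", "user")
print(pipeline[2]["$merge"]["into"])  # -> {'db': 'deleted__ilens_configuration', 'coll': 'user'}
```

Because `$merge` writes into another database, the archive survives even after the source documents are gone; the original collection never sees the `deleted.on` field.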
import json
from functools import lru_cache
@lru_cache()
def get_db_name(redis_client, project_id: str, database: str, delimiter="__"):
if not project_id:
return database
val = redis_client.get(project_id)
if val is None:
raise ValueError(f"Unknown Project, Project ID: {project_id} Not Found!!!")
val = json.loads(val)
if not val:
return database
# Get the prefix flag to apply project_id prefix to any db
prefix_condition = bool(val.get("source_meta", {}).get("add_prefix_to_database"))
if prefix_condition:
# Get the prefix name from mongo or default to project_id
prefix_name = val.get("source_meta", {}).get("prefix") or project_id
return f"{prefix_name}{delimiter}{database}"
return database
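`get_db_name` resolves the physical database name per project: it looks the project up in Redis, and prefixes the database name only when the project's `source_meta` opts in. The sketch below re-declares the function verbatim and pairs it with a hashable stub Redis client (`StubRedis` is illustrative) so the routing rules can be exercised without Redis:

```python
import json
from functools import lru_cache

# Standalone copy of get_db_name so the example runs without the SDK installed.
@lru_cache()
def get_db_name(redis_client, project_id, database, delimiter="__"):
    if not project_id:
        return database
    val = redis_client.get(project_id)
    if val is None:
        raise ValueError(f"Unknown Project, Project ID: {project_id} Not Found!!!")
    val = json.loads(val)
    if not val:
        return database
    if bool(val.get("source_meta", {}).get("add_prefix_to_database")):
        prefix_name = val.get("source_meta", {}).get("prefix") or project_id
        return f"{prefix_name}{delimiter}{database}"
    return database

class StubRedis:
    """Hashable stand-in for project_details_db (lru_cache requires hashable args)."""
    data = {
        "project_100": json.dumps({"source_meta": {"add_prefix_to_database": True}}),
        "project_200": json.dumps({"source_meta": {}}),
    }
    def get(self, key):
        return self.data.get(key)

stub = StubRedis()
print(get_db_name(stub, "project_100", "ilens_configuration"))  # -> project_100__ilens_configuration
print(get_db_name(stub, "project_200", "ilens_configuration"))  # -> ilens_configuration
```

The `lru_cache` on the real function means a project's routing decision is computed once per client/project/database triple, so changing `source_meta` in Redis only takes effect for cache misses.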
import uuid
from datetime import datetime, timedelta
from tb_sdk.connectors.constants.secrets import Secrets
from tb_sdk.connectors.db.redis_connections import login_db
from tb_sdk.connectors.errors import CustomError
from tb_sdk.connectors.utils.security_utils.jwt_util import JWT
jwt = JWT()
def create_token(
user_id,
ip,
token,
age=Secrets.LOCK_OUT_TIME_MINS,
login_token=None,
project_id=None,
):
"""
This method is to create a cookie
"""
try:
uid = login_token
if not uid:
uid = str(uuid.uuid4()).replace("-", "")
payload = {"ip": ip, "user_id": user_id, "token": token, "uid": uid, "age": age}
if project_id:
payload["project_id"] = project_id
exp = datetime.utcnow() + timedelta(minutes=age)
_extras = {"iss": Secrets.issuer, "exp": exp}
_payload = {**payload, **_extras}
new_token = jwt.encode(_payload)
# Add session to redis
login_db.set(uid, new_token)
login_db.expire(uid, timedelta(minutes=age))
return uid
except Exception as e:
raise CustomError(f"{str(e)}")
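`create_token` assembles a claims payload keyed by a random `uid`, encodes it as a JWT, stores that token in Redis under the `uid`, and hands the `uid` back as the opaque cookie value. A stdlib-only sketch of the payload assembly, with the JWT encoding and Redis write left out (`build_login_payload` and the `issuer` default are illustrative):

```python
import uuid
from datetime import datetime, timedelta, timezone

def build_login_payload(user_id, ip, token, age=20, login_token=None, project_id=None, issuer="tb-sdk"):
    """Assemble the claims create_token encodes; uid doubles as the Redis key / cookie value."""
    uid = login_token or str(uuid.uuid4()).replace("-", "")  # reuse the existing session id if given
    payload = {"ip": ip, "user_id": user_id, "token": token, "uid": uid, "age": age}
    if project_id:
        payload["project_id"] = project_id
    # Standard claims: issuer plus an expiry matching the Redis TTL
    payload.update({"iss": issuer, "exp": datetime.now(timezone.utc) + timedelta(minutes=age)})
    return uid, payload

uid, payload = build_login_payload("user_099", "127.0.0.1", "secret", project_id="project_100")
print(sorted(payload))  # -> ['age', 'exp', 'ip', 'iss', 'project_id', 'token', 'uid', 'user_id']
```

Giving Redis the same TTL as the JWT `exp` claim means a session dies in both places at once; passing `login_token` back in (as `CookieAuthentication` does) rotates the stored token without changing the cookie value.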
from secrets import compare_digest
from fastapi import HTTPException, Request, Response
from fastapi.openapi.models import APIKey, APIKeyIn
from fastapi.security import APIKeyCookie
from fastapi.security.api_key import APIKeyBase
from tb_sdk.config import Service
from tb_sdk.connectors.constants.secrets import Secrets
from tb_sdk.connectors.db.redis_connections import login_db
from tb_sdk.connectors.utils.security_utils.apply_encryption_util import create_token
from tb_sdk.connectors.utils.security_utils.jwt_util import JWT
class CookieAuthentication(APIKeyBase):
"""
Authentication backend using a cookie.
Internally, uses a JWT token to store the data.
"""
scheme: APIKeyCookie
cookie_name: str
cookie_secure: bool
def __init__(
self,
cookie_name: str = "login-token",
):
super().__init__()
self.model: APIKey = APIKey(**{"in": APIKeyIn.cookie}, name=cookie_name)
self.scheme_name = self.__class__.__name__
self.cookie_name = cookie_name
self.scheme = APIKeyCookie(name=self.cookie_name, auto_error=False)
self.login_redis = login_db
self.jwt = JWT()
async def __call__(self, request: Request, response: Response) -> str:
cookies = request.cookies
login_token = cookies.get("login-token")
if not login_token:
login_token = request.headers.get("login-token")
if not login_token:
raise HTTPException(status_code=401)
jwt_token = self.login_redis.get(login_token)
if not jwt_token:
raise HTTPException(status_code=401)
try:
decoded_token = self.jwt.validate(token=jwt_token)
if not decoded_token:
raise HTTPException(status_code=401)
except Exception as e:
raise HTTPException(status_code=401, detail=e.args)
user_id = decoded_token.get("user_id")
cookie_user_id = request.cookies.get(
"user_id",
request.cookies.get("userId", request.headers.get("userId", request.headers.get("user_id"))),
)
project_id = decoded_token.get("project_id")
cookie_project_id = request.cookies.get("projectId", request.headers.get("projectId"))
_token = decoded_token.get("token")
_age = int(decoded_token.get("age", Secrets.LOCK_OUT_TIME_MINS))
if any(
[
not compare_digest(Secrets.token, _token),
login_token != decoded_token.get("uid"),
cookie_user_id and not compare_digest(user_id, cookie_user_id),
project_id and cookie_project_id and not compare_digest(project_id, cookie_project_id),
]
):
raise HTTPException(status_code=401)
try:
new_token = create_token(
user_id=user_id,
ip=request.client.host,
token=Secrets.token,
age=_age,
login_token=login_token,
project_id=project_id,
)
except Exception as e:
raise HTTPException(status_code=401, detail=e.args)
response.set_cookie(
"login-token",
new_token,
samesite="strict",
httponly=True,
max_age=Secrets.LOCK_OUT_TIME_MINS * 60,
secure=Service.secure_cookie,
)
response.headers["login-token"] = new_token
return user_id
import jwt
from jwt.exceptions import ExpiredSignatureError, InvalidSignatureError, MissingRequiredClaimError
from tb_sdk.config import KeyPath
from tb_sdk.connectors.constants.secrets import Secrets
from tb_sdk.connectors.errors import AuthenticationError, ErrorMessages
class JWT:
def __init__(self):
self.max_login_age = Secrets.LOCK_OUT_TIME_MINS
self.issuer = Secrets.issuer
self.alg = Secrets.alg
self.public = KeyPath.public
self.private = KeyPath.private
    def encode(self, payload):
        try:
            # The with-block already closes the file; a trailing finally: f.close()
            # would raise NameError whenever open() itself failed.
            with open(self.private) as f:
                key = f.read()
            return jwt.encode(payload, key, algorithm=self.alg)
        except Exception as e:
            print(f"Exception while encoding JWT: {str(e)}")
            raise
    def validate(self, token):
        try:
            with open(self.public) as f:
                key = f.read()
            # PyJWT expects a list for the "algorithms" argument
            payload = jwt.decode(
                token,
                key,
                algorithms=[self.alg],
                leeway=Secrets.leeway_in_mins,
                options={"require": ["exp", "iss"]},
            )
            return payload
        except InvalidSignatureError:
            raise AuthenticationError(ErrorMessages.ERROR003)
        except ExpiredSignatureError:
            raise AuthenticationError(ErrorMessages.ERROR002)
        except MissingRequiredClaimError:
            raise AuthenticationError(ErrorMessages.ERROR002)
        except Exception as e:
            print(f"Exception while validating JWT: {str(e)}")
            raise
    def decode(self, token):
        try:
            with open(self.public) as f:
                key = f.read()
            return jwt.decode(token, key, algorithms=[self.alg])
        except Exception as e:
            print(f"Exception while decoding JWT: {str(e)}")
            raise
import logging
import orjson as json
from fastapi import HTTPException, Request, status
from tb_sdk.connectors.db.mongo import mongo_client
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user import User
from tb_sdk.connectors.db.mongo.ilens_configuration.collections.user_project import UserProject
from tb_sdk.connectors.db.redis_connections import user_role_permissions_redis
from tb_sdk.connectors.utils.common_utils import timed_lru_cache
@timed_lru_cache(seconds=60, maxsize=1000)
def get_user_role_id(user_id, project_id):
logging.debug("Fetching user role from DB")
user_conn = User(mongo_client=mongo_client)
if user_role := user_conn.find_user_role_for_user_id(user_id=user_id, project_id=project_id):
return user_role["userrole"][0]
# if user not found in primary collection, check if user is in project collection
user_proj_conn = UserProject(mongo_client=mongo_client)
if user_role := user_proj_conn.find_user_role_for_user_id(user_id=user_id, project_id=project_id):
return user_role["userrole"][0]
class RBAC:
def __init__(self, entity_name: str, operation: list[str]):
self.entity_name = entity_name
self.operation = operation
def check_permissions(self, user_id: str, project_id: str) -> dict[str, bool]:
user_role_id = get_user_role_id(user_id, project_id)
if not user_role_id:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="User role not found!")
r_key = f"{project_id}__{user_role_id}" # eg: project_100__user_role_100
user_role_rec = user_role_permissions_redis.hget(r_key, self.entity_name)
if not user_role_rec:
return {}
user_role_rec = json.loads(user_role_rec)
if permission_dict := {i: True for i in self.operation if user_role_rec.get(i)}:
return permission_dict
else:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Insufficient Permission!")
def __call__(self, request: Request) -> dict[str, bool]:
user_id = request.cookies.get("userId", request.headers.get("userId"))
project_id = request.cookies.get("projectId", request.headers.get("projectId"))
return self.check_permissions(user_id=user_id, project_id=project_id)
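`check_permissions` grants access only when at least one requested operation is enabled on the role's permission record: a missing record yields `{}`, while a record where every requested operation is disabled raises 403. A minimal sketch of that filtering rule (`resolve_permissions` is an illustrative name; `PermissionError` stands in for the SDK's `HTTPException` 403):

```python
def resolve_permissions(requested_ops, role_record):
    """Mirror of the dict-comprehension check in RBAC.check_permissions."""
    # Keep only the requested operations that the role record enables
    granted = {op: True for op in requested_ops if role_record.get(op)}
    if not granted:
        raise PermissionError("Insufficient Permission!")  # the SDK raises HTTP 403 here
    return granted

print(resolve_permissions(["read", "write"], {"read": True, "write": False, "delete": True}))  # -> {'read': True}
```

Note the semantics are any-of, not all-of: a caller requesting `["read", "write"]` gets through with only `read` enabled, and the returned dict tells the endpoint which subset was actually granted.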