Compare commits


No commits in common. "master" and "v3.1.1" have entirely different histories.

286 changed files with 3135 additions and 8010 deletions

.gitignore

@@ -15,6 +15,3 @@ stepfunctions/*/build
 # python test
 .coverage
 .report/
-# log
-.log


@@ -1,62 +0,0 @@
# EC2 instance management assets
## NLB gateway instance startup scripts
### Purpose
An EC2 instance runs as a gateway so that the NLB serving as Merck's DB connection route can reach the Aurora database through a bastion-style hop.
Inside the EC2 instance, `socat` processes must run in order to forward the NLB-facing ports to the Aurora database port.
This directory holds the commands that start those `socat` processes and the systemd configuration that launches them when the EC2 instance boots.
### Folder structure
```txt
.
├── README.md -- this file
└── gateway
    ├── staging -- settings for the staging environment
    │   ├── public1-1 -- settings for the ap-northeast-1a instance
    │   │   ├── socat-dbconnection-1a.service -- unit file that registers the socat process with systemd
    │   │   └── socat-portforwarding-1a.sh -- shell script that port-forwards to the Aurora database with socat
    │   └── public2-1 -- settings for the ap-northeast-1d instance
    │       ├── socat-dbconnection-1d.service
    │       └── socat-portforwarding-1d.sh
    └── product -- settings for the production environment
        ├── public1-1
        │   ├── socat-dbconnection-1a.service
        │   └── socat-portforwarding-1a.sh
        └── public2-1
            ├── socat-dbconnection-1d.service
            └── socat-portforwarding-1d.sh
```
### Deploying the files (common to both environments)
- Log in to the target gateway EC2 instance.
  - If you logged in via Session Manager, switch to `ec2-user` (`sudo su --login ec2-user`).
- Place the shell script that starts the `socat` processes:
  - **For the ap-northeast-1a instance (public-1-1)**
    - Run `sudo vi /opt/socat-portforwarding-1a.sh`.
    - Paste in the contents of `ec2/gateway/<environment name>/public-1-1/socat-portforwarding-1a.sh` and save.
    - Run `sudo chmod 774 /opt/socat-portforwarding-1a.sh`.
  - **For the ap-northeast-1d instance (public-2-1)**
    - Run `sudo vi /opt/socat-portforwarding-1d.sh`.
    - Paste in the contents of `ec2/gateway/<environment name>/public-2-1/socat-portforwarding-1d.sh` and save.
    - Run `sudo chmod 774 /opt/socat-portforwarding-1d.sh`.
- Place the service unit file that keeps the `socat` processes resident:
  - **For the ap-northeast-1a instance (public-1-1)**
    - Run `sudo vi /etc/systemd/system/socat-dbconnection-1a.service`.
    - Paste in the contents of `ec2/gateway/<environment name>/public-1-1/socat-dbconnection-1a.service` and save.
    - Run `sudo chmod 774 /etc/systemd/system/socat-dbconnection-1a.service`.
  - **For the ap-northeast-1d instance (public-2-1)**
    - Run `sudo vi /etc/systemd/system/socat-dbconnection-1d.service`.
    - Paste in the contents of `ec2/gateway/<environment name>/public-2-1/socat-dbconnection-1d.service` and save.
    - Run `sudo chmod 774 /etc/systemd/system/socat-dbconnection-1d.service`.
- Register the startup script with systemd:
  - **For the ap-northeast-1a instance (public-1-1)**
    - Run `sudo systemctl enable socat-dbconnection-1a.service` to enable the service.
    - Run `sudo systemctl status socat-dbconnection-1a.service` to check the service status; it is fine if the output shows `enabled;`.
  - **For the ap-northeast-1d instance (public-2-1)**
    - Run `sudo systemctl enable socat-dbconnection-1d.service` to enable the service.
    - Run `sudo systemctl status socat-dbconnection-1d.service` to check the service status; it is fine if the output shows `enabled;`.


@@ -1,10 +0,0 @@
[Unit]
Description = socat-dbconnection-1a
[Service]
ExecStart = /opt/socat-portforwarding-1a.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target


@@ -1,3 +0,0 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster-instance-1.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &


@@ -1,10 +0,0 @@
[Unit]
Description = socat-dbconnection-1d
[Service]
ExecStart = /opt/socat-portforwarding-1d.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target


@@ -1,3 +0,0 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster-instance-1-ap-northeast-1d.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &


@@ -1,10 +0,0 @@
[Unit]
Description = socat-dbconnection-1a
[Service]
ExecStart = /opt/socat-portforwarding-1a.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target


@@ -1,3 +0,0 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster-instance-1.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &


@@ -1,10 +0,0 @@
[Unit]
Description = socat-dbconnection-1d
[Service]
ExecStart = /opt/socat-portforwarding-1d.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target


@@ -1,3 +0,0 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster-instance-1-ap-northeast-1d.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &


@@ -1,20 +0,0 @@
# Files kept out of version control for the Lambdas implemented in Node.js
package-lock.json
node_modules/
# Environment-variable file for local verification
.env
# Python virtual-environment files
.venv
# Python cache files
__pycache__/
# Folder produced by converting the Step Functions state definitions
stepfunctions/*/build
**/.vscode/settings.json
# python test
.coverage
.report/
# log
.log


@@ -1,15 +1,15 @@
-FROM python:3.12-slim-bookworm
+FROM python:3.9
 ENV TZ="Asia/Tokyo"
-# Flag that stops Python from buffering its standard output
-ENV PYTHONUNBUFFERED=1
-# Flag that stops Python from writing bytecode files
-ENV PYTHONDONTWRITEBYTECODE=1
 
 WORKDIR /usr/src/app
 COPY Pipfile Pipfile.lock ./
 
 RUN \
     apt update -y && \
+    # Command that applies only the security updates for packages
+    apt install -y unattended-upgrades && \
+    unattended-upgrades && \
+    pip install --upgrade pip wheel setuptools && \
     pip install pipenv --no-cache-dir && \
     pipenv install --system --deploy && \
     pip uninstall -y pipenv virtualenv-clone virtualenv


@@ -11,7 +11,7 @@ test = "pytest tests/"
 
 [packages]
 boto3 = "*"
-simple-salesforce = "==1.12.6"
+simple-salesforce = "==1.12.4"
 tenacity = "*"
 
 [dev-packages]
@@ -23,4 +23,4 @@ pytest-html = "*"
 moto = "*"
 
 [requires]
-python_version = "3.12"
+python_version = "3.9"

File diff suppressed because it is too large

@@ -4,7 +4,7 @@
 
 ### Tool versions
-- Python 3.12.x
+- Python 3.9.x
 - PipEnv (a module for managing Python dependencies)
 
 ### Development environment


@@ -12,87 +12,87 @@ from src.system_var.environments import (CRM_BACKUP_BUCKET, CRM_CONFIG_BUCKET,
                                          RESPONSE_JSON_BACKUP_FOLDER)
 
-class S3Client:
+class S3Resource:
     def __init__(self, bucket_name: str) -> None:
-        self.__s3_client = boto3.client(AWS_RESOURCE_S3)
-        self.__s3_bucket = bucket_name
+        self.__s3_resource = boto3.resource(AWS_RESOURCE_S3)
+        self.__s3_bucket = self.__s3_resource.Bucket(bucket_name)
 
     def get_object(self, object_key: str) -> str:
-        response = self.__s3_client.get_object(Bucket=self.__s3_bucket, Key=object_key)
+        response = self.__s3_bucket.Object(object_key).get()
         body = response[S3_RESPONSE_BODY].read()
         return body.decode(S3_CHAR_CODE)
 
     def put_object(self, object_key: str, local_file_path: str) -> None:
-        self.__s3_client.upload_file(Filename=local_file_path, Bucket=self.__s3_bucket, Key=object_key)
+        self.__s3_bucket.upload_file(Key=object_key, Filename=local_file_path)
         return
 
     def copy(self, src_bucket: str, src_key: str, dest_bucket: str, dest_key: str) -> None:
         copy_source = {'Bucket': src_bucket, 'Key': src_key}
-        self.__s3_client.copy_object(CopySource=copy_source, Bucket=dest_bucket, Key=dest_key)
+        self.__s3_resource.meta.client.copy(copy_source, dest_bucket, dest_key)
         return
 
 class ConfigBucket:
-    __s3_client: S3Client = None
+    __s3_resource: S3Resource = None
 
     def __init__(self) -> None:
-        self.__s3_client = S3Client(CRM_CONFIG_BUCKET)
+        self.__s3_resource = S3Resource(CRM_CONFIG_BUCKET)
 
     def __str__(self) -> str:
         return CRM_CONFIG_BUCKET
 
     def get_object_info_file(self) -> str:
-        return self.__s3_client.get_object(f'{OBJECT_INFO_FOLDER}/{OBJECT_INFO_FILENAME}')
+        return self.__s3_resource.get_object(f'{OBJECT_INFO_FOLDER}/{OBJECT_INFO_FILENAME}')
 
     def get_last_fetch_datetime_file(self, file_key: str) -> str:
-        return self.__s3_client.get_object(f'{LAST_FETCH_DATE_FOLDER}/{file_key}')
+        return self.__s3_resource.get_object(f'{LAST_FETCH_DATE_FOLDER}/{file_key}')
 
     def put_last_fetch_datetime_file(self, file_key: str, local_file_path: str) -> None:
-        self.__s3_client.put_object(
+        self.__s3_resource.put_object(
             f'{LAST_FETCH_DATE_FOLDER}/{file_key}', local_file_path)
         return
 
 class DataBucket:
-    __s3_client: S3Client = None
+    __s3_resource: S3Resource = None
 
     def __init__(self) -> None:
-        self.__s3_client = S3Client(IMPORT_DATA_BUCKET)
+        self.__s3_resource = S3Resource(IMPORT_DATA_BUCKET)
 
     def __str__(self) -> str:
         return IMPORT_DATA_BUCKET
 
     def put_csv(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{CRM_IMPORT_DATA_FOLDER}/{file_key}'
-        self.__s3_client.put_object(object_key, local_file_path)
+        self.__s3_resource.put_object(object_key, local_file_path)
         return
 
     def put_csv_from(self, src_bucket: str, src_key: str):
         dest_filename = src_key.split('/')[-1]
-        self.__s3_client.copy(src_bucket, src_key, str(self), f'{CRM_IMPORT_DATA_FOLDER}/{dest_filename}')
+        self.__s3_resource.copy(src_bucket, src_key, str(self), f'{CRM_IMPORT_DATA_FOLDER}/{dest_filename}')
         return
 
 class BackupBucket:
-    __s3_client: S3Client = None
+    __s3_resource: S3Resource = None
 
     def __init__(self) -> None:
-        self.__s3_client = S3Client(CRM_BACKUP_BUCKET)
+        self.__s3_resource = S3Resource(CRM_BACKUP_BUCKET)
 
     def __str__(self) -> str:
         return CRM_BACKUP_BUCKET
 
     def put_response_json(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{RESPONSE_JSON_BACKUP_FOLDER}/{file_key}'
-        self.__s3_client.put_object(object_key, local_file_path)
+        self.__s3_resource.put_object(object_key, local_file_path)
         return
 
     def put_csv(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{CRM_IMPORT_DATA_BACKUP_FOLDER}/{file_key}'
-        self.__s3_client.put_object(object_key, local_file_path)
+        self.__s3_resource.put_object(object_key, local_file_path)
         return
 
     def put_result_json(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{PROCESS_RESULT_FOLDER}/{file_key}'
-        self.__s3_client.put_object(object_key, local_file_path)
+        self.__s3_resource.put_object(object_key, local_file_path)
         return
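The hunk above swaps one boto3 backend for the other while keeping the wrapper's small public contract (`get_object`/`put_object`/`copy`) unchanged, which is what lets the bucket classes stay untouched. A dependency-free sketch of that contract, using an in-memory dict as a stand-in for S3 (the class and the backing dict are entirely hypothetical, for illustration only):

```python
import os
import tempfile

_FAKE_S3: dict = {}  # maps (bucket, key) -> bytes; a toy stand-in for S3

class InMemoryS3Client:
    """Same method contract as the wrapper in the diff, minus boto3."""

    def __init__(self, bucket_name: str) -> None:
        self._bucket = bucket_name

    def get_object(self, object_key: str) -> str:
        # Mirror the wrapper: fetch the body and decode it to text.
        return _FAKE_S3[(self._bucket, object_key)].decode("utf-8")

    def put_object(self, object_key: str, local_file_path: str) -> None:
        # Mirror the wrapper: upload the contents of a local file.
        with open(local_file_path, "rb") as f:
            _FAKE_S3[(self._bucket, object_key)] = f.read()

    def copy(self, src_bucket: str, src_key: str, dest_bucket: str, dest_key: str) -> None:
        # Mirror the wrapper: server-side copy between buckets.
        _FAKE_S3[(dest_bucket, dest_key)] = _FAKE_S3[(src_bucket, src_key)]
```

Because the bucket classes only depend on these three methods, either boto3 backend (or a fake like this one in tests) can sit behind them.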


@@ -1,8 +1,7 @@
 import os
 
 import pytest
-from src.aws.s3 import BackupBucket, ConfigBucket, DataBucket, S3Client
+from src.aws.s3 import BackupBucket, ConfigBucket, DataBucket, S3Resource
 
 @pytest.fixture
@@ -16,7 +15,7 @@ def s3_test(s3_client, bucket_name):
     yield
 
-class TestS3Client:
+class TestS3Resource:
     def test_get_object(self, s3_test, s3_client, bucket_name):
         """
@@ -32,7 +31,7 @@ class TestS3Client:
         s3_client.put_object(Bucket=bucket_name, Key='hogehoge/test.txt', Body=b'aaaaaaaaaaaaaaa')
 
         # Act
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         actual = sut.get_object('hogehoge/test.txt')
 
         # Assert
@@ -49,7 +48,7 @@ class TestS3Client:
         """
         # Arrange
         # Act
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.get_object('hogehoge/test.txt')
@@ -69,7 +68,7 @@ class TestS3Client:
         with open(file_path, mode='w') as f:
             f.write('aaaaaaaaaaaaaaa')
 
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         sut.put_object('hogehoge/test.txt', file_path)
 
         actual = s3_client.get_object(Bucket=bucket_name, Key='hogehoge/test.txt')
@@ -88,7 +87,7 @@ class TestS3Client:
         """
         # Arrange
         # Act
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.put_object('hogehoge/test.txt', 'aaaaaaaaaaaaaaa')
@@ -109,7 +108,7 @@ class TestS3Client:
         s3_client.create_bucket(Bucket=for_copy_bucket)
         s3_client.put_object(Bucket=bucket_name, Key='hogehoge/test.txt', Body=b'aaaaaaaaaaaaaaa')
 
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         sut.copy(bucket_name, 'hogehoge/test.txt', for_copy_bucket, 'test.txt')
 
         actual = s3_client.get_object(Bucket=for_copy_bucket, Key='test.txt')
@@ -126,7 +125,7 @@ class TestS3Client:
         """
         # Arrange
         # Act
-        sut = S3Client(bucket_name)
+        sut = S3Resource(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.copy(bucket_name, 'hogehoge/test.txt', 'for_copy_bucket', 'test.txt')
@@ -141,8 +140,8 @@ class TestS3Client:
         - インスタンス生成時に例外が発生すること
         """
         with pytest.raises(Exception) as e:
-            S3Client()
-        assert e.value.args[0] == "S3Client.__init__() missing 1 required positional argument: 'bucket_name'"
+            S3Resource()
+        assert e.value.args[0] == "__init__() missing 1 required positional argument: 'bucket_name'"
 
 class TestConfigBucket:


@@ -4,7 +4,8 @@ from datetime import datetime
 
 import boto3
 import pytest
-from moto import mock_aws
+from moto import mock_s3
+from py.xml import html  # type: ignore
 
 from . import docstring_parser
@@ -20,7 +21,7 @@ def aws_credentials():
 
 @pytest.fixture
 def s3_client(aws_credentials):
-    with mock_aws():
+    with mock_s3():
         conn = boto3.client("s3", region_name="us-east-1")
         yield conn
@@ -34,18 +35,18 @@ def pytest_html_report_title(report):
 
 def pytest_html_results_table_header(cells):
     del cells[2:]
-    cells.insert(3, '<th>Cases</th>')
-    cells.insert(4, '<th>Arranges</th>')
-    cells.insert(5, '<th>Expects</th>')
-    cells.append('<th class="sortable time" col="time">Time</th>')
+    cells.insert(3, html.th("Cases"))
+    cells.insert(4, html.th("Arranges"))
+    cells.insert(5, html.th("Expects"))
+    cells.append(html.th("Time", class_="sortable time", col="time"))
 
 def pytest_html_results_table_row(report, cells):
     del cells[2:]
-    cells.insert(3, f'<td><pre>{report.cases}</pre></td>')  # Output the test cases to the report
-    cells.insert(4, f'<td><pre>{report.arranges}</pre></td>')  # Output the arranges to the report
-    cells.insert(5, f'<td><pre>{report.expects}</pre></td>')  # Output the expected results to the report
-    cells.append(f'<td class="col-time">{datetime.now()}</td>')  # Also output the time to the report
+    cells.insert(3, html.td(html.pre(report.cases)))  # Output the test cases to the report
+    cells.insert(4, html.td(html.pre(report.arranges)))  # Output the arranges to the report
+    cells.insert(5, html.td(html.pre(report.expects)))  # Output the expected results to the report
+    cells.append(html.td(datetime.now(), class_="col-time"))  # Also output the time to the report
 
 @pytest.hookimpl(hookwrapper=True)


@@ -652,8 +652,7 @@ class TestSalesforceApiClient:
         actual = sut.fetch_sf_data(soql)
 
         assert len(actual) > 0
-        print(dict(actual[0]))
-        assert dict(actual[0])["RelationshipTest__r"]["RecordType"]["DeveloperName"] == "RecordTypeNormal"
+        assert dict(actual[0])["RelationshipTest__r"]["RecordType"]["DeveloperName"] == "RecordTypeSpecial"
 
     def test_raise_create_instance_cause_auth_failed(self, monkeypatch):
         """


@@ -6,7 +6,6 @@ from datetime import datetime, timezone
 
 import boto3
 import pytest
-
 from src.controller import controller
 from src.parser.json_parser import JsonParser
 from src.system_var.constants import YYYYMMDDTHHMMSSTZ
@@ -115,10 +114,6 @@ def test_walk_through(s3_test, s3_client, monkeypatch, caplog):
     logger.info(f'##########################')
 
     # Assertion
     log_messages = caplog.messages
-    # Write the logs out to a local file to make visual inspection easier.
-    with open('crm_datafetch_test_walk_through_diff.log', 'w', encoding='utf8') as f:
-        f.write('\n'.join(log_messages))
-
     # Check the logs from before the loop
     assert 'I-CTRL-01 CRMデータ取得処理を開始します' in log_messages
     assert 'I-CTRL-02 データ取得準備処理呼び出し' in log_messages
@@ -175,10 +170,6 @@ def test_walk_through(s3_test, s3_client, monkeypatch, caplog):
     logger.info(f'##########################')
 
     # Re-fetch the logs
     log_messages_all = caplog.messages
-    # Write the logs out to a local file to make visual inspection easier.
-    with open('crm_datafetch_test_walk_through_all.log', 'w', encoding='utf8') as f:
-        f.write('\n'.join(log_messages_all))
-
     object_info_list_all = object_info_files[1]
     # The start logs and similar are already tested, so skip checking them
     for object_info in object_info_list_all['objects']:


@@ -99,9 +99,8 @@ class TestCounterObject:
             sut = CounterObject()
             sut.describe(1)
-        print(str(e.value))
 
         # Expects
-        assert str(e.value) == 'CounterObject.describe() takes 1 positional argument but 2 were given'
+        assert str(e.value) == 'describe() takes 1 positional argument but 2 were given'
 
     def test_increment(self) -> int:
         """


@@ -1,19 +1,16 @@
-FROM python:3.12-slim-bookworm
+FROM python:3.9
 ENV TZ="Asia/Tokyo"
-# Flag that stops Python from buffering its standard output
-ENV PYTHONUNBUFFERED=1
-# Flag that stops Python from writing bytecode files
-ENV PYTHONDONTWRITEBYTECODE=1
 
 WORKDIR /usr/src/app
-COPY Pipfile Pipfile.lock ./
+COPY requirements.txt ./
 
 RUN \
     apt update -y && \
-    pip install pipenv --no-cache-dir && \
-    pipenv install --system --deploy && \
-    pip uninstall -y pipenv virtualenv-clone virtualenv
+    # Command that applies only the security updates for packages
+    apt install -y unattended-upgrades && \
+    unattended-upgrades && \
+    pip install --upgrade pip wheel setuptools && \
+    pip install --no-cache-dir -r requirements.txt
 
 COPY dataimport ./
 
 CMD [ "python", "./controller.py" ]


@@ -1,13 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
boto3 = "*"
pymysql = "*"
[dev-packages]
[requires]
python_version = "3.12"


@@ -1,87 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "1738beec0de1a16f127d9bbeef1c9cb1ffb5b2377aa1aedbce9bfacae0fa1c67"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.12"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"boto3": {
"hashes": [
"sha256:3faa2c328a61745f3215a63039606a6fcf55d9afe1cc76e3a5e27b9db58cdbf6",
"sha256:b998edac72f6740bd5d9d585cf3880f2dfeb4842e626b34430fd0e9623378011"
],
"index": "pypi",
"markers": "python_version >= '3.9'",
"version": "==1.38.32"
},
"botocore": {
"hashes": [
"sha256:0899a090e352cb5eeaae2c7bb52a987b469d23912c7ece86664dfb5c2e074978",
"sha256:64ab919a5d8b74dd73eaac1f978d0e674d11ff3bbe8815c3d2982477be9a082c"
],
"markers": "python_version >= '3.9'",
"version": "==1.38.32"
},
"jmespath": {
"hashes": [
"sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980",
"sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe"
],
"markers": "python_version >= '3.7'",
"version": "==1.0.1"
},
"pymysql": {
"hashes": [
"sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c",
"sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0"
],
"index": "pypi",
"markers": "python_version >= '3.7'",
"version": "==1.1.1"
},
"python-dateutil": {
"hashes": [
"sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
"sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==2.9.0.post0"
},
"s3transfer": {
"hashes": [
"sha256:0148ef34d6dd964d0d8cf4311b2b21c474693e57c2e069ec708ce043d2b527be",
"sha256:f5e6db74eb7776a37208001113ea7aa97695368242b364d73e91c981ac522177"
],
"markers": "python_version >= '3.9'",
"version": "==0.13.0"
},
"six": {
"hashes": [
"sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274",
"sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==1.17.0"
},
"urllib3": {
"hashes": [
"sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466",
"sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813"
],
"markers": "python_version >= '3.9'",
"version": "==2.4.0"
}
},
"develop": {}
}


@@ -4,6 +4,7 @@ import sys
 
 from datetime import datetime
 
 import boto3
+
 from common import convert_quotechar, debug_log
 from end import end
 from error import error
@@ -40,7 +41,7 @@ LINE_FEED_CODE = {
 }
 
 # Class variables
-s3_client = boto3.client('s3')
+s3_resource = boto3.resource('s3')
 
 # Check exception class
@@ -73,14 +74,16 @@ def check(bucket_name, target_data_source, target_file_name, settings_key, log_i
     print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-CHK-01 - チェック処理を開始します')
 
     # Load the data
-    settings_obj_response = s3_client.get_object(Bucket=bucket_name, Key=settings_key)
+    settings_obj = s3_resource.Object(bucket_name, settings_key)
+    settings_response = settings_obj.get()
     settings_list = []
-    for line in io.TextIOWrapper(io.BytesIO(settings_obj_response["Body"].read()), encoding='utf-8'):
+    for line in io.TextIOWrapper(io.BytesIO(settings_response["Body"].read()), encoding='utf-8'):
         settings_list.append(line.rstrip('\n'))
 
     work_key = target_data_source + DIRECTORY_WORK + target_file_name
-    work_obj_response = s3_client.get_object(Bucket=bucket_name, Key=work_key)
-    work_data = io.TextIOWrapper(io.BytesIO(work_obj_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
+    work_obj = s3_resource.Object(bucket_name, work_key)
+    work_response = work_obj.get()
+    work_data = io.TextIOWrapper(io.BytesIO(work_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
     work_csv_row = []
     for i, line in enumerate(csv.reader(work_data, quotechar=convert_quotechar(settings_list[SETTINGS_ITEM["quotechar"]]), delimiter=settings_list[SETTINGS_ITEM["delimiter"]])):
         # If the file has a header and this is the first row
@@ -145,16 +148,3 @@ def is_empty_file(work_csv_row: list, settings_list: list):
         return len(work_csv_row[1:]) == 0
     return len(work_csv_row) == 0
-
-# Code for local execution
-# Adjust the values as appropriate
-# if __name__ == '__main__':
-#     check(
-#         bucket_name='bucket name',
-#         target_data_source='data source name',
-#         target_file_name='name of a file in the target folder',
-#         settings_key='name of the per-source settings file',
-#         log_info='Info',
-#         mode='i'
-#     )
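The `check()` body above decodes the S3 object with a per-data-source encoding and newline convention before handing it to `csv.reader` with a configurable quote character and delimiter. A minimal stdlib sketch of that decode-and-parse step (the function name and arguments are illustrative, not from the repository):

```python
import csv
import io

def parse_work_bytes(raw: bytes, char_code: str, line_feed: str,
                     quotechar: str, delimiter: str) -> list:
    # Decode the object body with the data source's own encoding and
    # newline convention, then parse it with the configured CSV dialect,
    # mirroring the io.TextIOWrapper + csv.reader combination in check().
    text = io.TextIOWrapper(io.BytesIO(raw), encoding=char_code, newline=line_feed)
    return [row for row in csv.reader(text, quotechar=quotechar, delimiter=delimiter)]
```

Passing the configured line-feed code as `newline` keeps the terminators untranslated, and `csv.reader` then handles them at record boundaries, so delimiters embedded inside quoted fields survive.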


@@ -1,8 +1,7 @@
 from datetime import datetime
 
 import boto3
-from common import debug_log
 from error import error
+from common import debug_log
 
 # Constants
 LOG_LEVEL = {'i': 'Info', 'e': 'Error'}
@@ -13,6 +12,7 @@ DIRECTORY_WARNING = '/warning/'
 
 # Class variables
 s3_client = boto3.client('s3')
+s3_resource = boto3.resource('s3')
 
 def end(bucket_name, target_data_source, target_file_name, warning_info, log_info, mode):
@@ -45,7 +45,8 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
         }
         done_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
         done_key = target_data_source + DIRECTORY_DONE + done_file_name
-        s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=done_key)
+        done_obj = s3_resource.Object(bucket_name, done_key)
+        done_obj.copy(copy_source)
         s3_client.delete_object(Bucket=bucket_name, Key=work_key)
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-02 - workディレクトリの {target_file_name} をdoneディレクトリに移動しました 移動後ファイル名:{done_file_name}')
@@ -63,20 +64,23 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
             # Create the warning file
            warning_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}_war.log'
             warning_key = target_data_source + DIRECTORY_WARNING + warning_file_name
-            s3_client.put_object(Bucket=bucket_name, Key=warning_key, Body=bytes(warning_info, 'utf-8'))
+            warning_obj = s3_resource.Object(bucket_name, warning_key)
+            warning_obj.put(Body=warning_info)
             print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-06 - warningディレクトリに {warning_file_name} を作成しました')
 
             # Create the warning result file
             result_warning_file_name = target_file_name + '.warning'
             result_warning_key = target_data_source + DIRECTORY_TARGET + result_warning_file_name
-            s3_client.put_object(Bucket=bucket_name, Key=result_warning_key, Body=b'')
+            result_warning_obj = s3_resource.Object(bucket_name, result_warning_key)
+            result_warning_obj.put(Body='')
             print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-07 - targetディレクトリに {result_warning_file_name} を作成しました')
         else:
             # Create the done result file
             print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-08 - Warning情報は存在しませんでした')
             result_done_file_name = target_file_name + '.done'
             result_done_key = target_data_source + DIRECTORY_TARGET + result_done_file_name
-            s3_client.put_object(Bucket=bucket_name, Key=result_done_key, Body=b'')
+            result_done_obj = s3_resource.Object(bucket_name, result_done_key)
+            result_done_obj.put(Body='')
             print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-09 - targetディレクトリに {result_done_file_name} を作成しました')
 
         # ⑤ Output the end-of-processing log
@@ -84,17 +88,3 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
     except Exception as e:
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-END-99 - エラー内容:{e}')
         error(bucket_name, target_data_source, target_file_name, log_info)
-
-# Code for local execution
-# Adjust the values as appropriate
-# if __name__ == '__main__':
-#     end(
-#         bucket_name='bucket name',
-#         target_data_source='data source name',
-#         target_file_name='name of a file in the target folder',
-#         # warning_info='warning details',  # Use this line when testing the warning path
-#         warning_info='',
-#         log_info='Info',
-#         mode='i'
-#     )

View File

@ -1,7 +1,6 @@
import sys
from datetime import datetime from datetime import datetime
import boto3 import boto3
import sys
# 定数 # 定数
LOG_LEVEL = {'i': 'Info', 'e': 'Error'} LOG_LEVEL = {'i': 'Info', 'e': 'Error'}
@ -11,6 +10,7 @@ DIRECTORY_ERROR = '/error/'
# クラス変数 # クラス変数
s3_client = boto3.client('s3') s3_client = boto3.client('s3')
s3_resource = boto3.resource('s3')
def error(bucket_name, target_data_source, target_file_name, log_info): def error(bucket_name, target_data_source, target_file_name, log_info):
@@ -34,7 +34,8 @@ def error(bucket_name, target_data_source, target_file_name, log_info):
         }
         error_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
         error_key = target_data_source + DIRECTORY_ERROR + error_file_name
-        s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=error_key)
+        error_obj = s3_resource.Object(bucket_name, error_key)
+        error_obj.copy(copy_source)
         s3_client.delete_object(Bucket=bucket_name, Key=work_key)
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-02 - workディレクトリの {target_file_name} をerrorディレクトリに移動しました 移動後ファイル名:{error_file_name}')
@@ -47,7 +48,8 @@ def error(bucket_name, target_data_source, target_file_name, log_info):
         # ④ S3バケット内のtargetディレクトリに、「投入データファイル名.error」ファイルを作成する
         result_error_file_name = target_file_name + '.error'
         result_error_key = target_data_source + DIRECTORY_TARGET + result_error_file_name
-        s3_client.put_object(Bucket=bucket_name, Key=result_error_key, Body=b'')
+        result_error_obj = s3_resource.Object(bucket_name, result_error_key)
+        result_error_obj.put(Body='')
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-04 - targetディレクトリに {result_error_file_name} を作成しました')
     except Exception as e:
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-ERR-99 - エラー内容:{e}')
@@ -79,14 +81,16 @@ def error_doing_file_exists(bucket_name, target_key, target_data_source, target_
         }
         error_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
         error_key = target_data_source + DIRECTORY_ERROR + error_file_name
-        s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=error_key)
+        error_obj = s3_resource.Object(bucket_name, error_key)
+        error_obj.copy(copy_source)
         s3_client.delete_object(Bucket=bucket_name, Key=target_key)
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-07 - targetディレクトリの {target_file_name} をerrorディレクトリに移動しました 移動後ファイル名:{error_file_name}')
         # ③ S3バケット内のtargetディレクトリに、「投入データファイル名.exclusive_error」ファイルを作成する
         result_error_file_name = target_file_name + '.exclusive_error'
         result_error_key = target_data_source + DIRECTORY_TARGET + result_error_file_name
-        s3_client.put_object(Bucket=bucket_name, Key=result_error_key, Body=b'')
+        result_error_obj = s3_resource.Object(bucket_name, result_error_key)
+        result_error_obj.put(Body='')
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-08 - targetディレクトリに {result_error_file_name} を作成しました')
     except Exception as e:
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-ERR-99 - エラー内容:{e}')
@@ -96,24 +100,3 @@ def error_doing_file_exists(bucket_name, target_key, target_data_source, target_
     # ⑤ 処理を終了する
     sys.exit()
-# ローカル実行用コード
-# 値はよしなに変えてください
-# if __name__ == '__main__':
-# エラー処理
-# error(
-# bucket_name='バケット名',
-# target_data_source='データソース名',
-# target_file_name='データソース名/target/ファイル名',
-# log_info='Info'
-# )
-# doingファイルの有無チェック関数
-# error_doing_file_exists(
-# bucket_name='バケット名',
-# target_key='投入データのフルパス',
-# target_data_source='投入データのディレクトリ名よりデータソースに該当する部分',
-# target_file_name='投入データのファイル名',
-# log_info='Info'
-# )
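The hunks above repeat one pattern: S3 has no rename operation, so "move the file to the error directory" is a server-side copy followed by a delete of the source key. A minimal sketch of that move, with the client passed in so a stub can stand in for boto3; `move_to_error` and `FakeS3` are illustrative names, not part of the loader:

```python
from datetime import datetime

def move_to_error(s3, bucket, data_source, file_name, src_key):
    # Timestamp prefix mirrors f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}' above
    error_key = f"{data_source}/error/{datetime.now():%Y%m%d%H%M%S}_{file_name}"
    s3.copy(CopySource={'Bucket': bucket, 'Key': src_key}, Bucket=bucket, Key=error_key)
    s3.delete_object(Bucket=bucket, Key=src_key)
    return error_key

class FakeS3:
    """Recording stub with the two method shapes the sketch relies on."""
    def __init__(self):
        self.calls = []
    def copy(self, **kw):
        self.calls.append(('copy', kw['Key']))
    def delete_object(self, **kw):
        self.calls.append(('delete', kw['Key']))

s3 = FakeS3()
key = move_to_error(s3, 'my-bucket', 'sales', 'data.tsv', 'sales/work/data.tsv')
```

Because the copy happens before the delete, a crash between the two calls leaves a duplicate rather than losing the file, which is the safer failure mode for a load pipeline.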

@@ -1,12 +1,12 @@
-import csv
+from datetime import datetime
+import boto3
 import io
+import csv
 import re
 import sys
-from datetime import datetime
+from error import error
+from error import error_doing_file_exists
-import boto3
 from common import debug_log
-from error import error, error_doing_file_exists
 # 定数
 LOG_LEVEL = {"i": 'Info', "e": 'Error'}
@@ -17,6 +17,7 @@ DIRECTORY_SETTINGS = '/settings/'
 # クラス変数
 s3_client = boto3.client('s3')
+s3_resource = boto3.resource('s3')
 def init(bucket_name, target_key, target_data_source, target_file_name, log_info, mode):
@@ -59,7 +60,8 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
     try:
         # ③ S3バケット内のtargetディレクトリに、「投入データファイル名.doing」ファイルを作成する
-        s3_client.put_object(Bucket=bucket_name, Key=doing_key, Body=b'')
+        doing_obj = s3_resource.Object(bucket_name, doing_key)
+        doing_obj.put(Body='')
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-04 - targetディレクトリに {doing_file_name} を作成しました')
         # ④ 投入データファイルをS3バケット内のtargetディレクトリから、workディレクトリに移動(コピー削除)する
@@ -68,7 +70,8 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
             'Key': target_key
         }
         work_key = target_data_source + DIRECTORY_WORK + target_file_name
-        s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=work_key)
+        work_obj = s3_resource.Object(bucket_name, work_key)
+        work_obj.copy(copy_source)
         s3_client.delete_object(Bucket=bucket_name, Key=target_key)
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-05 - 投入データ {target_file_name} をworkディレクトリに移動しました')
     except Exception as e:
@@ -119,8 +122,9 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
     try:
         # ⑦ 個別設定ファイルを特定する
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-17 - 個別設定ファイルを検索します')
-        mapping_obj_response = s3_client.get_object(Bucket=bucket_name, Key=mapping_key)
-        mapping_body = io.TextIOWrapper(io.BytesIO(mapping_obj_response["Body"].read()), encoding='utf-8')
+        mapping_obj = s3_resource.Object(bucket_name, mapping_key)
+        mapping_response = mapping_obj.get()
+        mapping_body = io.TextIOWrapper(io.BytesIO(mapping_response["Body"].read()), encoding='utf-8')
         settings_file_name = ''
         for row in csv.reader(mapping_body, delimiter='\t'):
             if row:
@@ -155,15 +159,3 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
     except Exception as e:
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-INI-99 - エラー内容:{e}')
         error(bucket_name, target_data_source, target_file_name, log_info)
-# ローカル実行用コード
-# 値はよしなに変えてください
-# if __name__ == '__main__':
-# init(
-# bucket_name='バケット名',
-# target_key='データソース名/target/ファイル名',
-# target_data_source='データソース名',
-# target_file_name='ファイル名',
-# log_info='Info',
-# mode='i'
-# )
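The `init()` hunks above derive every key (the `.doing` marker, the `work/` destination) from the incoming object key, assuming the layout `<data_source>/target/<file>`. A small sketch of that derivation; the helper name `split_target_key` is illustrative, not from the loader:

```python
DIRECTORY_TARGET = '/target/'
DIRECTORY_WORK = '/work/'

def split_target_key(target_key):
    # 'sales/target/data.tsv' -> data source 'sales', file name 'data.tsv'
    data_source, _, file_name = target_key.partition(DIRECTORY_TARGET)
    doing_key = data_source + DIRECTORY_TARGET + file_name + '.doing'
    work_key = data_source + DIRECTORY_WORK + file_name
    return data_source, file_name, doing_key, work_key

parts = split_target_key('sales/target/data.tsv')
```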

@@ -5,9 +5,10 @@ from datetime import datetime
 import boto3
 import pymysql
+from pymysql.constants import CLIENT
 from common import convert_quotechar, debug_log
 from error import error
-from pymysql.constants import CLIENT
 # 定数
 DIRECTORY_WORK = '/work/'
@@ -46,6 +47,7 @@ INVALID_CONFIG_EXCEPTION_MESSAGE = f'個別設定ファイルのインポート
 # クラス変数
 s3_client = boto3.client('s3')
+s3_resource = boto3.resource('s3')
 def main(bucket_name, target_data_source, target_file_name, settings_key, db_info, log_info, mode):
@@ -89,7 +91,8 @@ def main(bucket_name, target_data_source, target_file_name, settings_key, db_inf
         # ④ 個別設定ファイルのロードスキーマのテーブル名に記載されているテーブルをTRUNCATEする
         # 個別設定ファイルの読み込み
-        settings_response = s3_client.get_object(Bucket=bucket_name, Key=settings_key)
+        settings_obj = s3_resource.Object(bucket_name, settings_key)
+        settings_response = settings_obj.get()
         settings_list = []
         for line in io.TextIOWrapper(io.BytesIO(settings_response["Body"].read()), encoding='utf-8'):
             settings_list.append(line.rstrip('\n'))
@@ -107,7 +110,8 @@ def main(bucket_name, target_data_source, target_file_name, settings_key, db_inf
         # ⑤ 投入データファイルを1行ごとにループする
         print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-MAIN-05 - 投入データ {target_file_name} の読み込みを開始します')
         work_key = target_data_source + DIRECTORY_WORK + target_file_name
-        work_response = s3_client.get_object(Bucket=bucket_name, Key=work_key)
+        work_obj = s3_resource.Object(bucket_name, work_key)
+        work_response = work_obj.get()
         work_data = io.TextIOWrapper(io.BytesIO(work_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
         process_count = 0  # 処理件数カウンタ
@@ -257,9 +261,10 @@
         try:
             if ex_sql_file_exists:
                 # 拡張SQLファイルからSQL文生成
-                ex_sql_obj_response = s3_client.get_object(Bucket=bucket_name, Key=ex_sql_key)
+                ex_sqls_obj = s3_resource.Object(bucket_name, ex_sql_key)
+                ex_sql_response = ex_sqls_obj.get()
                 ex_sql = ''
-                for line in io.TextIOWrapper(io.BytesIO(ex_sql_obj_response["Body"].read()), encoding='utf-8'):
+                for line in io.TextIOWrapper(io.BytesIO(ex_sql_response["Body"].read()), encoding='utf-8'):
                     ex_sql = f'{ex_sql} {line.rstrip()}'
                 # トランザクション開始
@@ -353,18 +358,3 @@ def truncate_judge(settings_list):
 class InvalidConfigException(Exception):
     pass
-# ローカル実行用コード
-# 値はよしなに変えてください
-# if __name__ == '__main__':
-# DB_INFO = {"host": '127.0.0.1', "name": 'org02', "pass": 'user', "user": 'user'}
-# main(
-# bucket_name='バケット名',
-# target_data_source='投入データのディレクトリ名よりデータソースに該当する部分',
-# target_file_name='投入データのファイル名',
-# settings_key='投入データに該当する個別設定ファイルのフルパス',
-# db_info=DB_INFO,
-# log_info='info',
-# mode='i'
-# )
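`main()` above wraps the S3 `Body` bytes in `io.TextIOWrapper` so the character code and line-feed code from the settings file are honored while streaming. The same wrapping can be exercised against an in-memory `BytesIO` standing in for the S3 response body:

```python
import csv
import io

# Simulated S3 Body: UTF-8 encoded, CRLF line endings, tab-separated
body = io.BytesIO('a\tb\r\nc\td\r\n'.encode('utf-8'))
# newline='\r\n' mirrors LINE_FEED_CODE[...] being passed to TextIOWrapper
work_data = io.TextIOWrapper(body, encoding='utf-8', newline='\r\n')
rows = list(csv.reader(work_data, delimiter='\t'))
```

Decoding lazily through the wrapper avoids building one large decoded string before the per-line loop starts.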

@@ -0,0 +1,2 @@
boto3
PyMySQL

@@ -1,12 +0,0 @@
tests/*
.coverage
.env
.env.example
.report/*
.vscode/*
.pytest_cache/*
*/__pychache__/*
Dockerfile
pytest.ini
README.md
*.sql

@@ -1,9 +0,0 @@
DB_HOST=************
DB_PORT=3306
DB_USERNAME=************
DB_PASSWORD=************
DB_SCHEMA=*****
DUMP_BACKUP_BUCKET=************
LOG_LEVEL=INFO

@@ -1,11 +0,0 @@
.vscode/settings.json
.env
my.cnf
# python
__pycache__
# python test
.pytest_cache
.coverage
.report/

@@ -1,16 +0,0 @@
{
    // IntelliSense を使用して利用可能な属性を学べます。
    // 既存の属性の説明をホバーして表示します。
    // 詳細情報は次を確認してください: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "(DEBUG) export dbdump",
"type": "python",
"request": "launch",
"program": "entrypoint.py",
"console": "integratedTerminal",
"justMyCode": true
}
]
}

@@ -1,31 +0,0 @@
{
"[python]": {
"editor.defaultFormatter": null,
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": true
}
},
//
"python.defaultInterpreterPath": "<pythonインタプリターのパス>",
"python.linting.lintOnSave": true,
"python.linting.enabled": true,
"python.linting.pylintEnabled": false,
"python.linting.flake8Enabled": true,
"python.linting.flake8Args": [
"--max-line-length=200",
"--ignore=F541"
],
"python.formatting.provider": "autopep8",
"python.formatting.autopep8Path": "autopep8",
"python.formatting.autopep8Args": [
"--max-line-length", "200",
"--ignore=F541"
],
"python.testing.pytestArgs": [
"tests/batch/ultmarc"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
}

@@ -1,40 +0,0 @@
FROM python:3.12-slim-bookworm
ENV TZ="Asia/Tokyo"
# pythonの標準出力をバッファリングしないフラグ
ENV PYTHONUNBUFFERED=1
# pythonのバイトコードを生成しないフラグ
ENV PYTHONDONTWRITEBYTECODE=1
WORKDIR /usr/src/app
COPY Pipfile Pipfile.lock ./
# mysql-apt-config をdpkgでインストールする際に標準出力に渡す文字列ファイルをコピー
COPY mysql_dpkg_selection.txt ./
# 必要なパッケージインストール
RUN apt update && apt install -y less vim curl wget gzip unzip sudo lsb-release
# mysqlをインストール
RUN \
wget https://dev.mysql.com/get/mysql-apt-config_0.8.29-1_all.deb && \
apt install -y gnupg && \
dpkg -i mysql-apt-config_0.8.29-1_all.deb < mysql_dpkg_selection.txt && \
apt update && \
apt install -y mysql-client
# aws cli v2 のインストール
RUN \
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
unzip awscliv2.zip && \
sudo ./aws/install
# python関連のライブラリインストール
RUN \
pip install --upgrade pip wheel setuptools && \
pip install pipenv --no-cache-dir && \
pipenv install --system --deploy && \
pip uninstall -y pipenv virtualenv-clone virtualenv
COPY src ./src
COPY entrypoint.py entrypoint.py
CMD ["python", "entrypoint.py"]

@@ -1,16 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
[dev-packages]
autopep8 = "*"
flake8 = "*"
[requires]
python_version = "3.12"
[pipenv]
allow_prereleases = true

@@ -1,63 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "2f7808325e11704ced6ad10c85e1d583663a03d7ccabaa9696ab1fe133a6b30c"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.12"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {},
"develop": {
"autopep8": {
"hashes": [
"sha256:89440a4f969197b69a995e4ce0661b031f455a9f776d2c5ba3dbd83466931758",
"sha256:ce8ad498672c845a0c3de2629c15b635ec2b05ef8177a6e7c91c74f3e9b51128"
],
"index": "pypi",
"markers": "python_version >= '3.9'",
"version": "==2.3.2"
},
"flake8": {
"hashes": [
"sha256:1cbc62e65536f65e6d754dfe6f1bada7f5cf392d6f5db3c2b85892466c3e7c1a",
"sha256:c586ffd0b41540951ae41af572e6790dbd49fc12b3aa2541685d253d9bd504bd"
],
"index": "pypi",
"markers": "python_full_version >= '3.8.1'",
"version": "==7.1.2"
},
"mccabe": {
"hashes": [
"sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325",
"sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"
],
"markers": "python_version >= '3.6'",
"version": "==0.7.0"
},
"pycodestyle": {
"hashes": [
"sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
"sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
],
"markers": "python_version >= '3.8'",
"version": "==2.12.1"
},
"pyflakes": {
"hashes": [
"sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f",
"sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a"
],
"markers": "python_version >= '3.8'",
"version": "==3.2.0"
}
}
}

@@ -1,48 +0,0 @@
# 【共通】DBダンプ取得 
## 概要
当処理は特定の機能で利用するものではなく、共通処理として要件に応じて実行することを想定している。
## 環境情報
- Python 3.9
- MySQL 8.23
- VSCode
## 環境構築
- Python の構築
- Merck_NewDWH 開発 2021 の Wiki、[Python 環境構築](https://nds-tyo.backlog.com/alias/wiki/1874930)を参照
- 「Pipenv の導入」までを行っておくこと
- 構築完了後、プロジェクト配下で以下のコマンドを実行し、Python の仮想環境を作成する
- `pipenv install --dev --python <pyenvでインストールしたpythonバージョン>`
- この手順で出力される仮想環境のパスは、後述する VSCode の設定手順で使用するため、控えておく
- MySQL の環境構築
- Windows の場合、以下のリンクからダウンロードする
- <https://dev.mysql.com/downloads/installer/>
- Docker を利用する場合、「newsdwh-tools」リポジトリの MySQL 設定を使用すると便利
- 「crm-table-to-ddl」フォルダ内で以下のコマンドを実行すると
- `docker-compose up -d`
- Docker の構築手順は、[Docker のセットアップ手順](https://nds-tyo.backlog.com/alias/wiki/1754332)を参照のこと
- データを投入する
- 立ち上げたデータベースに「src05」スキーマを作成する
- [ローカル開発用データ](https://ndstokyo.sharepoint.com/:f:/r/sites/merck-new-dwh-team/Shared%20Documents/03.NewDWH%E6%A7%8B%E7%AF%89%E3%83%95%E3%82%A7%E3%83%BC%E3%82%BA3/02.%E9%96%8B%E7%99%BA/90.%E9%96%8B%E7%99%BA%E5%85%B1%E6%9C%89/%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E9%96%8B%E7%99%BA%E7%94%A8%E3%83%87%E3%83%BC%E3%82%BF?csf=1&web=1&e=VVcRUs)をダウンロードし、mysql コマンドを使用して復元する
- `mysql -h <ホスト名> -P <ポート> -u <ユーザー名> -p src05 < src05_dump.sql`
- 環境変数の設定
- 「.env.example」ファイルをコピーし、「.env」ファイルを作成する
- 環境変数を設定する。設定内容は PRJ メンバーより共有を受けてください
- VSCode の設定
- 「.vscode/recommended_settings.json」ファイルをコピーし、「settings.json」ファイルを作成する
- 「python.defaultInterpreterPath」を、Python の構築手順で作成した仮想環境のパスに変更する
## 実行
- VSCode 上で「F5」キーを押下すると、バッチ処理が起動する。
- 「entrypoint.py」が、バッチ処理のエントリーポイント。
- 実際の処理は、「src/jobctrl_dbdump.py」で行っている。
## フォルダ構成(工事中)

@@ -1,10 +0,0 @@
"""【共通】DBダンプ取得処理のエントリーポイント"""
from src import jobctrl_dbdump
if __name__ == '__main__':
try:
exit(jobctrl_dbdump.exec())
except Exception:
# エラーが起きても、正常系のコードで返す。
# エラーが起きた事実はbatch_process内でログを出す。
exit(0)

@@ -1,3 +0,0 @@
1
1
4

@@ -1,106 +0,0 @@
"""DBダンプ取得"""
import datetime
import os
import subprocess
import textwrap
from src.logging.get_logger import get_logger
from src.system_var import constants, environment
logger = get_logger('DBダンプ取得')
def exec():
try:
logger.info('DBダンプ取得開始')
# 事前処理(共通処理としては空振りする)
_pre_exec()
# メイン処理
# MySQL接続情報を作成する
my_cnf_file_content = f"""
[client]
user={environment.DB_USERNAME}
password={environment.DB_PASSWORD}
host={environment.DB_HOST}
"""
# my.cnfファイルのパス
my_cnf_path = os.path.join('my.cnf')
# my.cnfファイルを生成する
with open(my_cnf_path, 'w') as f:
f.write(textwrap.dedent(my_cnf_file_content)[1:-1])
# ファイルのパーミッションが強いとmysqldumpコマンドが実行できないため
# my.cnfファイルのパーミッションをread-onlyに設定
os.chmod(my_cnf_path, 0o444)
dt_now = datetime.datetime.now()
converted_value = dt_now.strftime('%Y%m%d%H%M%S%f')
dump_file_name = f'backup_rds_{environment.DB_SCHEMA}_{converted_value}.gz'
s3_file_path = f's3://{environment.DUMP_BACKUP_BUCKET}/{constants.DUMP_BACKUP_FOLDER}/{dt_now.year}/{dt_now.strftime("%m")}/{dt_now.strftime("%d")}/{dump_file_name}'
# mysqldumpコマンドを実行し、dumpを取得する
command = [
'mysqldump',
f'--defaults-file={my_cnf_path}',
'-P',
f"{environment.DB_PORT}",
'--no-tablespaces',
'--skip-column-statistics',
'--single-transaction',
'--set-gtid-purged=OFF',
environment.DB_SCHEMA
]
mysqldump_process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# gzipコマンドを実行してdump結果を圧縮する
gzip_process = subprocess.Popen(['gzip', '-c'], stdin=mysqldump_process.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# aws s3 cpコマンドを実行してアップロードする
s3_cp_process = subprocess.Popen(['aws', 's3', 'cp', '-', s3_file_path], stdin=gzip_process.stdout, stderr=subprocess.PIPE)
# mysqldumpの標準出力をgzipに接続したため、標準出力をクローズする
mysqldump_process.stdout.close()
# gzipの標準出力をaws s3 cpに接続したため、標準出力をクローズする
gzip_process.stdout.close()
# パイプラインを実行し、エラーハンドリング
_, error = mysqldump_process.communicate()
if mysqldump_process.returncode != 0:
raise Exception(f'`mysqldump`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
_, error = gzip_process.communicate()
if gzip_process.returncode != 0:
raise Exception(f'`gzip`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
_, error = s3_cp_process.communicate()
if s3_cp_process.returncode != 0:
raise Exception(f'`aws s3 cp`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
# 事後処理(共通処理としては空振りする)
_post_exec()
logger.info('DBダンプ取得終了正常終了')
logger.info(f'出力ファイルパス: {s3_file_path}')
return constants.BATCH_EXIT_CODE_SUCCESS
except Exception as e:
logger.exception(f'DBダンプ取得中に想定外のエラーが発生しました :{e}')
return constants.BATCH_EXIT_CODE_SUCCESS
def _pre_exec():
"""
ダンプ復元 事前処理
共通機能としては事前処理を実装しない
事前処理が必要なダンプ復元処理を実装する場合当ロジックをコピーする
"""
pass
def _post_exec():
"""
ダンプ復元 事後処理
共通機能としては事後処理を実装しない
事後処理が必要なダンプ復元処理を実装する場合当ロジックをコピーする
"""
pass
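The removed dump job above chains three processes, `mysqldump | gzip | aws s3 cp -`, so the dump is compressed and uploaded as a stream without ever touching local disk. The same plumbing can be sketched with harmless stand-in commands (`printf` and a gzip round-trip, assumed available on a Linux host) in place of the real pipeline:

```python
import subprocess

# Stage 1: producer (stands in for mysqldump)
p_dump = subprocess.Popen(['printf', 'hello'], stdout=subprocess.PIPE)
# Stage 2: compressor, reading the producer's stdout
p_gzip = subprocess.Popen(['gzip', '-c'], stdin=p_dump.stdout, stdout=subprocess.PIPE)
# Stage 3: consumer (stands in for `aws s3 cp -`); decompresses to verify the stream
p_sink = subprocess.Popen(['gzip', '-dc'], stdin=p_gzip.stdout, stdout=subprocess.PIPE)
# Close our copies of the intermediate pipe ends so EOF/SIGPIPE propagates
p_dump.stdout.close()
p_gzip.stdout.close()
out, _ = p_sink.communicate()
p_dump.wait()
p_gzip.wait()
```

Closing the parent's references to the intermediate pipes is the step the original code comments call out: without it, the downstream processes never see end-of-file.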

@@ -1,37 +0,0 @@
import logging
from src.system_var.environment import LOG_LEVEL
# boto3関連モジュールのログレベルを事前に個別指定し、モジュール内のDEBUGログの表示を抑止する
for name in ["boto3", "botocore", "s3transfer", "urllib3"]:
logging.getLogger(name).setLevel(logging.WARNING)
def get_logger(log_name: str) -> logging.Logger:
"""一意のログ出力モジュールを取得します。
Args:
log_name (str): ロガー名
Returns:
_type_: _description_
"""
logger = logging.getLogger(log_name)
level = logging.getLevelName(LOG_LEVEL)
if not isinstance(level, int):
level = logging.INFO
logger.setLevel(level)
if not logger.hasHandlers():
handler = logging.StreamHandler()
logger.addHandler(handler)
formatter = logging.Formatter(
'%(name)s\t[%(levelname)s]\t%(asctime)s\t%(message)s',
'%Y-%m-%d %H:%M:%S'
)
for handler in logger.handlers:
handler.setFormatter(formatter)
return logger

@@ -1,5 +0,0 @@
# バッチ正常終了コード
BATCH_EXIT_CODE_SUCCESS = 0
# ダンプバックアップフォルダー
DUMP_BACKUP_FOLDER = 'dump'

@@ -1,19 +0,0 @@
import os
# Database
DB_HOST = os.environ['DB_HOST']
DB_PORT = int(os.environ['DB_PORT'])
DB_USERNAME = os.environ['DB_USERNAME']
DB_PASSWORD = os.environ['DB_PASSWORD']
DB_SCHEMA = os.environ['DB_SCHEMA']
# AWS
DUMP_BACKUP_BUCKET = os.environ['DUMP_BACKUP_BUCKET']
# 初期値がある環境変数
LOG_LEVEL = os.environ.get('LOG_LEVEL', 'INFO')
DB_CONNECTION_MAX_RETRY_ATTEMPT = int(os.environ.get('DB_CONNECTION_MAX_RETRY_ATTEMPT', 4))
DB_CONNECTION_RETRY_INTERVAL_INIT = int(os.environ.get('DB_CONNECTION_RETRY_INTERVAL', 5))
DB_CONNECTION_RETRY_INTERVAL_MIN_SECONDS = int(os.environ.get('DB_CONNECTION_RETRY_MIN_SECONDS', 5))
DB_CONNECTION_RETRY_INTERVAL_MAX_SECONDS = int(os.environ.get('DB_CONNECTION_RETRY_MAX_SECONDS', 50))
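The environment module above distinguishes required settings, read with `os.environ['KEY']` so a missing variable fails fast with `KeyError`, from optional ones read with `os.environ.get(..., default)` and converted to `int`. A small sketch of both patterns, using throwaway `DEMO_*` variable names so it does not collide with real configuration:

```python
import os

# Make sure the demo variables are unset before reading them
for key in ('DEMO_DB_HOST', 'DEMO_LOG_LEVEL', 'DEMO_MAX_RETRY'):
    os.environ.pop(key, None)

try:
    host = os.environ['DEMO_DB_HOST']  # required: raises KeyError when unset
except KeyError:
    host = None  # in the real module this aborts startup instead

log_level = os.environ.get('DEMO_LOG_LEVEL', 'INFO')   # optional with default
retry = int(os.environ.get('DEMO_MAX_RETRY', 4))       # optional, coerced to int
```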

@@ -18,20 +18,83 @@
     "default": {
         "boto3": {
             "hashes": [
-                "sha256:3faa2c328a61745f3215a63039606a6fcf55d9afe1cc76e3a5e27b9db58cdbf6",
-                "sha256:b998edac72f6740bd5d9d585cf3880f2dfeb4842e626b34430fd0e9623378011"
+                "sha256:791523f41b5e731c8ac0d2f65b978348fb6c92f02e0dbc9a7bc0b3760195cc60",
+                "sha256:795803812b78260cfe019ef65021ec9e8049bfb6a027564e1a3b59c1a4a11106"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.9'",
-            "version": "==1.38.32"
+            "version": "==1.34.24"
         },
         "botocore": {
             "hashes": [
-                "sha256:0899a090e352cb5eeaae2c7bb52a987b469d23912c7ece86664dfb5c2e074978",
-                "sha256:64ab919a5d8b74dd73eaac1f978d0e674d11ff3bbe8815c3d2982477be9a082c"
+                "sha256:4a48c15b87c6a72719a6c2d8688f4e52fe52c18ac9dfcaa25c7e62c5df475ee2",
+                "sha256:c92f810b5faec5126f3faf7dc7d77346e407ab3b89bb613290c86ff2fd5405b9"
             ],
-            "markers": "python_version >= '3.9'",
-            "version": "==1.38.32"
+            "markers": "python_version >= '3.8'",
+            "version": "==1.34.24"
         },
"greenlet": {
"hashes": [
"sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67",
"sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6",
"sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257",
"sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4",
"sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676",
"sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61",
"sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc",
"sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca",
"sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7",
"sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728",
"sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305",
"sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6",
"sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379",
"sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414",
"sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04",
"sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a",
"sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf",
"sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491",
"sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559",
"sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e",
"sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274",
"sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb",
"sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b",
"sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9",
"sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b",
"sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be",
"sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506",
"sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405",
"sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113",
"sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f",
"sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5",
"sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230",
"sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d",
"sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f",
"sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a",
"sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e",
"sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61",
"sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6",
"sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d",
"sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71",
"sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22",
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2",
"sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3",
"sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067",
"sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc",
"sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881",
"sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3",
"sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e",
"sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac",
"sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53",
"sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0",
"sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b",
"sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83",
"sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41",
"sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c",
"sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf",
"sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da",
"sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"
],
"markers": "platform_machine == 'aarch64' or (platform_machine == 'ppc64le' or (platform_machine == 'x86_64' or (platform_machine == 'amd64' or (platform_machine == 'AMD64' or (platform_machine == 'win32' or platform_machine == 'WIN32')))))",
"version": "==3.0.3"
         },
         "jmespath": {
             "hashes": [
@@ -43,241 +106,217 @@
         },
         "pymysql": {
             "hashes": [
-                "sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c",
-                "sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0"
+                "sha256:4f13a7df8bf36a51e81dd9f3605fede45a4878fe02f9236349fd82a3f0612f96",
+                "sha256:8969ec6d763c856f7073c4c64662882675702efcb114b4bcbb955aea3a069fa7"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.7'",
-            "version": "==1.1.1"
+            "version": "==1.1.0"
         },
         "python-dateutil": {
             "hashes": [
-                "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
-                "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
+                "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86",
+                "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"
             ],
-            "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
-            "version": "==2.9.0.post0"
+            "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
+            "version": "==2.8.2"
         },
         "s3transfer": {
             "hashes": [
-                "sha256:0148ef34d6dd964d0d8cf4311b2b21c474693e57c2e069ec708ce043d2b527be",
-                "sha256:f5e6db74eb7776a37208001113ea7aa97695368242b364d73e91c981ac522177"
+                "sha256:3cdb40f5cfa6966e812209d0994f2a4709b561c88e90cf00c2696d2df4e56b2e",
+                "sha256:d0c8bbf672d5eebbe4e57945e23b972d963f07d82f661cabf678a5c88831595b"
             ],
-            "markers": "python_version >= '3.9'",
-            "version": "==0.13.0"
+            "markers": "python_version >= '3.8'",
+            "version": "==0.10.0"
         },
         "six": {
             "hashes": [
-                "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274",
-                "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"
+                "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926",
+                "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"
             ],
-            "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
-            "version": "==1.17.0"
+            "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
+            "version": "==1.16.0"
         },
         "sqlalchemy": {
             "hashes": [
-                "sha256:023b3ee6169969beea3bb72312e44d8b7c27c75b347942d943cf49397b7edeb5",
+                "sha256:0d3cab3076af2e4aa5693f89622bef7fa770c6fec967143e4da7508b3dceb9b9",
-                "sha256:03968a349db483936c249f4d9cd14ff2c296adfa1290b660ba6516f973139582",
+                "sha256:0dacf67aee53b16f365c589ce72e766efaabd2b145f9de7c917777b575e3659d",
-                "sha256:05132c906066142103b83d9c250b60508af556982a385d96c4eaa9fb9720ac2b",
+                "sha256:10331f129982a19df4284ceac6fe87353ca3ca6b4ca77ff7d697209ae0a5915e",
-                "sha256:087b6b52de812741c27231b5a3586384d60c353fbd0e2f81405a814b5591dc8b",
+                "sha256:14a6f68e8fc96e5e8f5647ef6cda6250c780612a573d99e4d881581432ef1669",
-                "sha256:0b3dbf1e7e9bc95f4bac5e2fb6d3fb2f083254c3fdd20a1789af965caf2d2348",
+                "sha256:1b1180cda6df7af84fe72e4530f192231b1f29a7496951db4ff38dac1687202d",
-                "sha256:118c16cd3f1b00c76d69343e38602006c9cfb9998fa4f798606d28d63f23beda",
+                "sha256:29049e2c299b5ace92cbed0c1610a7a236f3baf4c6b66eb9547c01179f638ec5",
-                "sha256:1936af879e3db023601196a1684d28e12f19ccf93af01bf3280a3262c4b6b4e5",
+                "sha256:342d365988ba88ada8af320d43df4e0b13a694dbd75951f537b2d5e4cb5cd002",
-                "sha256:1e3f196a0c59b0cae9a0cd332eb1a4bda4696e863f4f1cf84ab0347992c548c2",
+                "sha256:420362338681eec03f53467804541a854617faed7272fe71a1bfdb07336a381e",
-                "sha256:23a8825495d8b195c4aa9ff1c430c28f2c821e8c5e2d98089228af887e5d7e29",
+                "sha256:4344d059265cc8b1b1be351bfb88749294b87a8b2bbe21dfbe066c4199541ebd",
-                "sha256:293cd444d82b18da48c9f71cd7005844dbbd06ca19be1ccf6779154439eec0b8",
+                "sha256:4f7a7d7fcc675d3d85fbf3b3828ecd5990b8d61bd6de3f1b260080b3beccf215",
-                "sha256:32f9dc8c44acdee06c8fc6440db9eae8b4af8b01e4b1aee7bdd7241c22edff4f",
+                "sha256:555651adbb503ac7f4cb35834c5e4ae0819aab2cd24857a123370764dc7d7e24",
-                "sha256:34ea30ab3ec98355235972dadc497bb659cc75f8292b760394824fab9cf39826",
+                "sha256:59a21853f5daeb50412d459cfb13cb82c089ad4c04ec208cd14dddd99fc23b39",
-                "sha256:3d3549fc3e40667ec7199033a4e40a2f669898a00a7b18a931d3efb4c7900504",
+                "sha256:5fdd402169aa00df3142149940b3bf9ce7dde075928c1886d9a1df63d4b8de62",
-                "sha256:41836fe661cc98abfae476e14ba1906220f92c4e528771a8a3ae6a151242d2ae",
+                "sha256:605b6b059f4b57b277f75ace81cc5bc6335efcbcc4ccb9066695e515dbdb3900",
-                "sha256:4d44522480e0bf34c3d63167b8cfa7289c1c54264c2950cc5fc26e7850967e45",
+                "sha256:665f0a3954635b5b777a55111ababf44b4fc12b1f3ba0a435b602b6387ffd7cf",
-                "sha256:4eeb195cdedaf17aab6b247894ff2734dcead6c08f748e617bfe05bd5a218443",
+                "sha256:6f9e2e59cbcc6ba1488404aad43de005d05ca56e069477b33ff74e91b6319735",
-                "sha256:4f67766965996e63bb46cfbf2ce5355fc32d9dd3b8ad7e536a920ff9ee422e23",
+                "sha256:736ea78cd06de6c21ecba7416499e7236a22374561493b456a1f7ffbe3f6cdb4",
-                "sha256:57df5dc6fdb5ed1a88a1ed2195fd31927e705cad62dedd86b46972752a80f576",
+                "sha256:74b080c897563f81062b74e44f5a72fa44c2b373741a9ade701d5f789a10ba23",
-                "sha256:598d9ebc1e796431bbd068e41e4de4dc34312b7aa3292571bb3674a0cb415dd1",
+                "sha256:75432b5b14dc2fff43c50435e248b45c7cdadef73388e5610852b95280ffd0e9",
-                "sha256:5b14e97886199c1f52c14629c11d90c11fbb09e9334fa7bb5f6d068d9ced0ce0",
+                "sha256:75f99202324383d613ddd1f7455ac908dca9c2dd729ec8584c9541dd41822a2c",
-                "sha256:5e22575d169529ac3e0a120cf050ec9daa94b6a9597993d1702884f6954a7d71",
+                "sha256:790f533fa5c8901a62b6fef5811d48980adeb2f51f1290ade8b5e7ba990ba3de",
-                "sha256:60c578c45c949f909a4026b7807044e7e564adf793537fc762b2489d522f3d11",
+                "sha256:798f717ae7c806d67145f6ae94dc7c342d3222d3b9a311a784f371a4333212c7",
-                "sha256:6145afea51ff0af7f2564a05fa95eb46f542919e6523729663a5d285ecb3cf5e",
+                "sha256:7c88f0c7dcc5f99bdb34b4fd9b69b93c89f893f454f40219fe923a3a2fd11625",
"sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f", "sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f",
"sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8", "sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a",
"sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd", "sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4",
"sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814", "sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643",
"sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08", "sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018",
"sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea", "sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9",
"sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30", "sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5",
"sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda", "sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08",
"sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9", "sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3",
"sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923", "sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3",
"sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df", "sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed",
"sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036", "sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95",
"sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3", "sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6",
"sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f", "sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0",
"sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6", "sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84",
"sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04", "sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f",
"sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2", "sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d",
"sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560", "sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2",
"sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70", "sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5",
"sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769", "sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570",
"sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1", "sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7",
"sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6", "sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c",
"sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b", "sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf",
"sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747", "sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3",
"sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078", "sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed",
"sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440", "sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5"
"sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
"sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
"sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
"sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
"sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
"sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
"sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
"sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.7'", "version": "==2.0.25"
"version": "==2.0.41"
}, },
"tenacity": { "tenacity": {
"hashes": [ "hashes": [
"sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb", "sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a",
"sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138" "sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.9'", "version": "==8.2.3"
"version": "==9.1.2"
}, },
"typing-extensions": { "typing-extensions": {
"hashes": [ "hashes": [
"sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4", "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783",
"sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af" "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"
], ],
"markers": "python_version >= '3.9'", "markers": "python_version >= '3.8'",
"version": "==4.14.0" "version": "==4.9.0"
}, },
"urllib3": { "urllib3": {
"hashes": [ "hashes": [
"sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e", "sha256:34b97092d7e0a3a8cf7cd10e386f401b3737364026c45e622aa02903dffe0f07",
"sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32" "sha256:f8ecc1bba5667413457c529ab955bf8c67b45db799d159066261719e328580a0"
], ],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'", "markers": "python_version < '3.10'",
"version": "==1.26.20" "version": "==1.26.18"
} }
}, },
     "develop": {
         "autopep8": {
             "hashes": [
-                "sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda",
-                "sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d"
+                "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
+                "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.8'",
-            "version": "==2.3.1"
+            "version": "==2.0.4"
         },
         "boto3": {
             "hashes": [
-                "sha256:9edf49640c79a05b0a72f4c2d1e24dfc164344b680535a645f455ac624dc3680",
-                "sha256:db58348849a5af061f0f5ec9c3b699da5221ca83354059fdccb798e3ddb6b62a"
+                "sha256:791523f41b5e731c8ac0d2f65b978348fb6c92f02e0dbc9a7bc0b3760195cc60",
+                "sha256:795803812b78260cfe019ef65021ec9e8049bfb6a027564e1a3b59c1a4a11106"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.8'",
-            "version": "==1.35.57"
+            "version": "==1.34.24"
         },
         "botocore": {
             "hashes": [
-                "sha256:92ddd02469213766872cb2399269dd20948f90348b42bf08379881d5e946cc34",
-                "sha256:d96306558085baf0bcb3b022d7a8c39c93494f031edb376694d2b2dcd0e81327"
+                "sha256:4a48c15b87c6a72719a6c2d8688f4e52fe52c18ac9dfcaa25c7e62c5df475ee2",
+                "sha256:c92f810b5faec5126f3faf7dc7d77346e407ab3b89bb613290c86ff2fd5405b9"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==1.35.57"
+            "version": "==1.34.24"
         },
         "coverage": {
             "extras": [
                 "toml"
             ],
             "hashes": [
-                "sha256:00a1d69c112ff5149cabe60d2e2ee948752c975d95f1e1096742e6077affd376",
-                "sha256:023bf8ee3ec6d35af9c1c6ccc1d18fa69afa1cb29eaac57cb064dbb262a517f9",
-                "sha256:0294ca37f1ba500667b1aef631e48d875ced93ad5e06fa665a3295bdd1d95111",
-                "sha256:06babbb8f4e74b063dbaeb74ad68dfce9186c595a15f11f5d5683f748fa1d172",
-                "sha256:0809082ee480bb8f7416507538243c8863ac74fd8a5d2485c46f0f7499f2b491",
-                "sha256:0b3fb02fe73bed561fa12d279a417b432e5b50fe03e8d663d61b3d5990f29546",
-                "sha256:0b58c672d14f16ed92a48db984612f5ce3836ae7d72cdd161001cc54512571f2",
-                "sha256:0bcd1069e710600e8e4cf27f65c90c7843fa8edfb4520fb0ccb88894cad08b11",
-                "sha256:1032e178b76a4e2b5b32e19d0fd0abbce4b58e77a1ca695820d10e491fa32b08",
-                "sha256:11a223a14e91a4693d2d0755c7a043db43d96a7450b4f356d506c2562c48642c",
-                "sha256:12394842a3a8affa3ba62b0d4ab7e9e210c5e366fbac3e8b2a68636fb19892c2",
-                "sha256:182e6cd5c040cec0a1c8d415a87b67ed01193ed9ad458ee427741c7d8513d963",
-                "sha256:1d5b8007f81b88696d06f7df0cb9af0d3b835fe0c8dbf489bad70b45f0e45613",
-                "sha256:1f76846299ba5c54d12c91d776d9605ae33f8ae2b9d1d3c3703cf2db1a67f2c0",
-                "sha256:27fb4a050aaf18772db513091c9c13f6cb94ed40eacdef8dad8411d92d9992db",
-                "sha256:29155cd511ee058e260db648b6182c419422a0d2e9a4fa44501898cf918866cf",
-                "sha256:29fc0f17b1d3fea332f8001d4558f8214af7f1d87a345f3a133c901d60347c73",
-                "sha256:2b6b4c83d8e8ea79f27ab80778c19bc037759aea298da4b56621f4474ffeb117",
-                "sha256:2fdef0d83a2d08d69b1f2210a93c416d54e14d9eb398f6ab2f0a209433db19e1",
-                "sha256:3c65d37f3a9ebb703e710befdc489a38683a5b152242664b973a7b7b22348a4e",
-                "sha256:4f704f0998911abf728a7783799444fcbbe8261c4a6c166f667937ae6a8aa522",
-                "sha256:51b44306032045b383a7a8a2c13878de375117946d68dcb54308111f39775a25",
-                "sha256:53d202fd109416ce011578f321460795abfe10bb901b883cafd9b3ef851bacfc",
-                "sha256:58809e238a8a12a625c70450b48e8767cff9eb67c62e6154a642b21ddf79baea",
-                "sha256:5915fcdec0e54ee229926868e9b08586376cae1f5faa9bbaf8faf3561b393d52",
-                "sha256:5beb1ee382ad32afe424097de57134175fea3faf847b9af002cc7895be4e2a5a",
-                "sha256:5f8ae553cba74085db385d489c7a792ad66f7f9ba2ee85bfa508aeb84cf0ba07",
-                "sha256:5fbd612f8a091954a0c8dd4c0b571b973487277d26476f8480bfa4b2a65b5d06",
-                "sha256:6bd818b7ea14bc6e1f06e241e8234508b21edf1b242d49831831a9450e2f35fa",
-                "sha256:6f01ba56b1c0e9d149f9ac85a2f999724895229eb36bd997b61e62999e9b0901",
-                "sha256:73d2b73584446e66ee633eaad1a56aad577c077f46c35ca3283cd687b7715b0b",
-                "sha256:7bb92c539a624cf86296dd0c68cd5cc286c9eef2d0c3b8b192b604ce9de20a17",
-                "sha256:8165b796df0bd42e10527a3f493c592ba494f16ef3c8b531288e3d0d72c1f6f0",
-                "sha256:862264b12ebb65ad8d863d51f17758b1684560b66ab02770d4f0baf2ff75da21",
-                "sha256:8902dd6a30173d4ef09954bfcb24b5d7b5190cf14a43170e386979651e09ba19",
-                "sha256:8cf717ee42012be8c0cb205dbbf18ffa9003c4cbf4ad078db47b95e10748eec5",
-                "sha256:8ed9281d1b52628e81393f5eaee24a45cbd64965f41857559c2b7ff19385df51",
-                "sha256:99b41d18e6b2a48ba949418db48159d7a2e81c5cc290fc934b7d2380515bd0e3",
-                "sha256:9cb7fa111d21a6b55cbf633039f7bc2749e74932e3aa7cb7333f675a58a58bf3",
-                "sha256:a181e99301a0ae128493a24cfe5cfb5b488c4e0bf2f8702091473d033494d04f",
-                "sha256:a413a096c4cbac202433c850ee43fa326d2e871b24554da8327b01632673a076",
-                "sha256:a6b1e54712ba3474f34b7ef7a41e65bd9037ad47916ccb1cc78769bae324c01a",
-                "sha256:ade3ca1e5f0ff46b678b66201f7ff477e8fa11fb537f3b55c3f0568fbfe6e718",
-                "sha256:b0ac3d42cb51c4b12df9c5f0dd2f13a4f24f01943627120ec4d293c9181219ba",
-                "sha256:b369ead6527d025a0fe7bd3864e46dbee3aa8f652d48df6174f8d0bac9e26e0e",
-                "sha256:b57b768feb866f44eeed9f46975f3d6406380275c5ddfe22f531a2bf187eda27",
-                "sha256:b8d3a03d9bfcaf5b0141d07a88456bb6a4c3ce55c080712fec8418ef3610230e",
-                "sha256:bc66f0bf1d7730a17430a50163bb264ba9ded56739112368ba985ddaa9c3bd09",
-                "sha256:bf20494da9653f6410213424f5f8ad0ed885e01f7e8e59811f572bdb20b8972e",
-                "sha256:c48167910a8f644671de9f2083a23630fbf7a1cb70ce939440cd3328e0919f70",
-                "sha256:c481b47f6b5845064c65a7bc78bc0860e635a9b055af0df46fdf1c58cebf8e8f",
-                "sha256:c7c8b95bf47db6d19096a5e052ffca0a05f335bc63cef281a6e8fe864d450a72",
-                "sha256:c9b8e184898ed014884ca84c70562b4a82cbc63b044d366fedc68bc2b2f3394a",
-                "sha256:cc8ff50b50ce532de2fa7a7daae9dd12f0a699bfcd47f20945364e5c31799fef",
-                "sha256:d541423cdd416b78626b55f123412fcf979d22a2c39fce251b350de38c15c15b",
-                "sha256:dab4d16dfef34b185032580e2f2f89253d302facba093d5fa9dbe04f569c4f4b",
-                "sha256:dacbc52de979f2823a819571f2e3a350a7e36b8cb7484cdb1e289bceaf35305f",
-                "sha256:df57bdbeffe694e7842092c5e2e0bc80fff7f43379d465f932ef36f027179806",
-                "sha256:ed8fe9189d2beb6edc14d3ad19800626e1d9f2d975e436f84e19efb7fa19469b",
-                "sha256:f3ddf056d3ebcf6ce47bdaf56142af51bb7fad09e4af310241e9db7a3a8022e1",
-                "sha256:f8fe4984b431f8621ca53d9380901f62bfb54ff759a1348cd140490ada7b693c",
-                "sha256:fe439416eb6380de434886b00c859304338f8b19f6f54811984f3420a2e03858"
+                "sha256:04387a4a6ecb330c1878907ce0dc04078ea72a869263e53c72a1ba5bbdf380ca",
+                "sha256:0676cd0ba581e514b7f726495ea75aba3eb20899d824636c6f59b0ed2f88c471",
+                "sha256:0e8d06778e8fbffccfe96331a3946237f87b1e1d359d7fbe8b06b96c95a5407a",
+                "sha256:0eb3c2f32dabe3a4aaf6441dde94f35687224dfd7eb2a7f47f3fd9428e421058",
+                "sha256:109f5985182b6b81fe33323ab4707011875198c41964f014579cf82cebf2bb85",
+                "sha256:13eaf476ec3e883fe3e5fe3707caeb88268a06284484a3daf8250259ef1ba143",
+                "sha256:164fdcc3246c69a6526a59b744b62e303039a81e42cfbbdc171c91a8cc2f9446",
+                "sha256:26776ff6c711d9d835557ee453082025d871e30b3fd6c27fcef14733f67f0590",
+                "sha256:26f66da8695719ccf90e794ed567a1549bb2644a706b41e9f6eae6816b398c4a",
+                "sha256:29f3abe810930311c0b5d1a7140f6395369c3db1be68345638c33eec07535105",
+                "sha256:316543f71025a6565677d84bc4df2114e9b6a615aa39fb165d697dba06a54af9",
+                "sha256:36b0ea8ab20d6a7564e89cb6135920bc9188fb5f1f7152e94e8300b7b189441a",
+                "sha256:3cc9d4bc55de8003663ec94c2f215d12d42ceea128da8f0f4036235a119c88ac",
+                "sha256:485e9f897cf4856a65a57c7f6ea3dc0d4e6c076c87311d4bc003f82cfe199d25",
+                "sha256:5040148f4ec43644702e7b16ca864c5314ccb8ee0751ef617d49aa0e2d6bf4f2",
+                "sha256:51456e6fa099a8d9d91497202d9563a320513fcf59f33991b0661a4a6f2ad450",
+                "sha256:53d7d9158ee03956e0eadac38dfa1ec8068431ef8058fe6447043db1fb40d932",
+                "sha256:5a10a4920def78bbfff4eff8a05c51be03e42f1c3735be42d851f199144897ba",
+                "sha256:5b14b4f8760006bfdb6e08667af7bc2d8d9bfdb648351915315ea17645347137",
+                "sha256:5b2ccb7548a0b65974860a78c9ffe1173cfb5877460e5a229238d985565574ae",
+                "sha256:697d1317e5290a313ef0d369650cfee1a114abb6021fa239ca12b4849ebbd614",
+                "sha256:6ae8c9d301207e6856865867d762a4b6fd379c714fcc0607a84b92ee63feff70",
+                "sha256:707c0f58cb1712b8809ece32b68996ee1e609f71bd14615bd8f87a1293cb610e",
+                "sha256:74775198b702868ec2d058cb92720a3c5a9177296f75bd97317c787daf711505",
+                "sha256:756ded44f47f330666843b5781be126ab57bb57c22adbb07d83f6b519783b870",
+                "sha256:76f03940f9973bfaee8cfba70ac991825611b9aac047e5c80d499a44079ec0bc",
+                "sha256:79287fd95585ed36e83182794a57a46aeae0b64ca53929d1176db56aacc83451",
+                "sha256:799c8f873794a08cdf216aa5d0531c6a3747793b70c53f70e98259720a6fe2d7",
+                "sha256:7d360587e64d006402b7116623cebf9d48893329ef035278969fa3bbf75b697e",
+                "sha256:80b5ee39b7f0131ebec7968baa9b2309eddb35b8403d1869e08f024efd883566",
+                "sha256:815ac2d0f3398a14286dc2cea223a6f338109f9ecf39a71160cd1628786bc6f5",
+                "sha256:83c2dda2666fe32332f8e87481eed056c8b4d163fe18ecc690b02802d36a4d26",
+                "sha256:846f52f46e212affb5bcf131c952fb4075b55aae6b61adc9856222df89cbe3e2",
+                "sha256:936d38794044b26c99d3dd004d8af0035ac535b92090f7f2bb5aa9c8e2f5cd42",
+                "sha256:9864463c1c2f9cb3b5db2cf1ff475eed2f0b4285c2aaf4d357b69959941aa555",
+                "sha256:995ea5c48c4ebfd898eacb098164b3cc826ba273b3049e4a889658548e321b43",
+                "sha256:a1526d265743fb49363974b7aa8d5899ff64ee07df47dd8d3e37dcc0818f09ed",
+                "sha256:a56de34db7b7ff77056a37aedded01b2b98b508227d2d0979d373a9b5d353daa",
+                "sha256:a7c97726520f784239f6c62506bc70e48d01ae71e9da128259d61ca5e9788516",
+                "sha256:b8e99f06160602bc64da35158bb76c73522a4010f0649be44a4e167ff8555952",
+                "sha256:bb1de682da0b824411e00a0d4da5a784ec6496b6850fdf8c865c1d68c0e318dd",
+                "sha256:bf477c355274a72435ceb140dc42de0dc1e1e0bf6e97195be30487d8eaaf1a09",
+                "sha256:bf635a52fc1ea401baf88843ae8708591aa4adff875e5c23220de43b1ccf575c",
+                "sha256:bfd5db349d15c08311702611f3dccbef4b4e2ec148fcc636cf8739519b4a5c0f",
+                "sha256:c530833afc4707fe48524a44844493f36d8727f04dcce91fb978c414a8556cc6",
+                "sha256:cc6d65b21c219ec2072c1293c505cf36e4e913a3f936d80028993dd73c7906b1",
+                "sha256:cd3c1e4cb2ff0083758f09be0f77402e1bdf704adb7f89108007300a6da587d0",
+                "sha256:cfd2a8b6b0d8e66e944d47cdec2f47c48fef2ba2f2dff5a9a75757f64172857e",
+                "sha256:d0ca5c71a5a1765a0f8f88022c52b6b8be740e512980362f7fdbb03725a0d6b9",
+                "sha256:e7defbb9737274023e2d7af02cac77043c86ce88a907c58f42b580a97d5bcca9",
+                "sha256:e9d1bf53c4c8de58d22e0e956a79a5b37f754ed1ffdbf1a260d9dcfa2d8a325e",
+                "sha256:ea81d8f9691bb53f4fb4db603203029643caffc82bf998ab5b59ca05560f4c06"
             ],
-            "markers": "python_version >= '3.9'",
-            "version": "==7.6.4"
+            "markers": "python_version >= '3.8'",
+            "version": "==7.4.0"
         },
         "exceptiongroup": {
             "hashes": [
-                "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b",
-                "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"
+                "sha256:4bfd3996ac73b41e9b9628b04e079f193850720ea5945fc96a08633c66912f14",
+                "sha256:91f5c769735f051a4290d52edd0858999b57e5876e9f85937691bd4c9fa3ed68"
             ],
             "markers": "python_version < '3.11'",
-            "version": "==1.2.2"
+            "version": "==1.2.0"
         },
         "flake8": {
             "hashes": [
-                "sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38",
-                "sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213"
+                "sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132",
+                "sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3"
             ],
             "index": "pypi",
-            "markers": "python_full_version >= '3.8.1'",
-            "version": "==7.1.1"
+            "version": "==7.0.0"
        },
         "iniconfig": {
             "hashes": [
@@ -305,27 +344,27 @@
         },
         "packaging": {
             "hashes": [
-                "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759",
-                "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f"
+                "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5",
+                "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"
             ],
-            "markers": "python_version >= '3.8'",
-            "version": "==24.2"
+            "markers": "python_version >= '3.7'",
+            "version": "==23.2"
         },
         "pluggy": {
             "hashes": [
-                "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1",
-                "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669"
+                "sha256:cf61ae8f126ac6f7c451172cf30e3e43d3ca77615509771b3a984a0730651e12",
+                "sha256:d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==1.5.0"
+            "version": "==1.3.0"
         },
         "pycodestyle": {
             "hashes": [
-                "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
-                "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
+                "sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f",
+                "sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==2.12.1"
+            "version": "==2.11.1"
         },
         "pyflakes": {
             "hashes": [
@@ -337,37 +376,35 @@
         },
         "pytest": {
             "hashes": [
-                "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181",
-                "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2"
+                "sha256:42ed2f917ded90ceb752dbe2ecb48c436c2a70d38bc16018c2d11da6426a18b6",
+                "sha256:efc82dc5e6f2f41ae5acb9eabdf2ced192f336664c436b24a7db2c6aaafe4efd"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.8'",
-            "version": "==8.3.3"
+            "version": "==8.0.0rc2"
         },
         "pytest-cov": {
             "hashes": [
-                "sha256:eee6f1b9e61008bd34975a4d5bab25801eb31898b032dd55addc93e96fcaaa35",
-                "sha256:fde0b595ca248bb8e2d76f020b465f3b107c9632e6a1d1705f17834c89dcadc0"
+                "sha256:3904b13dfbfec47f003b8e77fd5b589cd11904a21ddf1ab38a64f204d6a10ef6",
+                "sha256:6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.9'",
-            "version": "==6.0.0"
+            "version": "==4.1.0"
         },
         "python-dateutil": {
             "hashes": [
-                "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
-                "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
+                "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86",
+                "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"
             ],
             "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
-            "version": "==2.9.0.post0"
+            "version": "==2.8.2"
         },
         "s3transfer": {
             "hashes": [
-                "sha256:263ed587a5803c6c708d3ce44dc4dfedaab4c1a32e8329bab818933d79ddcf5d",
-                "sha256:4f50ed74ab84d474ce614475e0b8d5047ff080810aac5d01ea25231cfc944b0c"
+                "sha256:3cdb40f5cfa6966e812209d0994f2a4709b561c88e90cf00c2696d2df4e56b2e",
+                "sha256:d0c8bbf672d5eebbe4e57945e23b972d963f07d82f661cabf678a5c88831595b"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==0.10.3"
+            "version": "==0.10.0"
         },
         "six": {
             "hashes": [
@@ -379,19 +416,19 @@
         },
         "tomli": {
             "hashes": [
-                "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38",
-                "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed"
+                "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc",
+                "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"
             ],
             "markers": "python_version < '3.11'",
-            "version": "==2.0.2"
+            "version": "==2.0.1"
         },
         "urllib3": {
             "hashes": [
-                "sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e",
-                "sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32"
+                "sha256:34b97092d7e0a3a8cf7cd10e386f401b3737364026c45e622aa02903dffe0f07",
+                "sha256:f8ecc1bba5667413457c529ab955bf8c67b45db799d159066261719e328580a0"
             ],
             "markers": "python_version < '3.10'",
-            "version": "==1.26.20"
+            "version": "==1.26.18"
         }
     }
 }


@@ -20,8 +20,8 @@ def exec():
     db = Database.get_instance()
     try:
         db.connect()
-        db.begin()
         db.to_jst()
+        db.begin()
         logger.debug('DCF施設統合マスタ作成処理開始')
         # COM施設からDCF施設統合マスタに登録
         (is_add_dcf_inst_merge, duplication_inst_records) = _insert_dcf_inst_merge_from_com_inst(db)
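The hunk above moves `db.begin()` from before `db.to_jst()` (master) to after it (v3.1.1). A minimal sketch, using a hypothetical stand-in for the `Database` wrapper in `src.db.database` (whose real implementation is not shown in this diff), makes the two resulting call orders explicit:

```python
# FakeDatabase is a hypothetical stub that only records call order;
# the real Database class talks to the Aurora instance.
class FakeDatabase:
    def __init__(self):
        self.calls = []

    def connect(self):
        self.calls.append("connect")

    def to_jst(self):
        # session-level setting (e.g. switching the session time zone to JST)
        self.calls.append("to_jst")

    def begin(self):
        # opens the explicit transaction
        self.calls.append("begin")


master_side = FakeDatabase()
master_side.connect()
master_side.begin()   # master: transaction opened first...
master_side.to_jst()  # ...so the session setting is issued inside it

v311_side = FakeDatabase()
v311_side.connect()
v311_side.to_jst()    # v3.1.1: setting issued before the transaction
v311_side.begin()

print(master_side.calls)  # ['connect', 'begin', 'to_jst']
print(v311_side.calls)    # ['connect', 'to_jst', 'begin']
```

Whether the time-zone setting lands inside or outside the explicit transaction is the only observable difference between the two orderings.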


@@ -1,5 +1,4 @@
 from datetime import datetime, timedelta
 from src.batch.batch_functions import logging_sql
 from src.batch.common.batch_context import BatchContext
 from src.db.database import Database
@@ -15,8 +14,8 @@ def exec():
     db = Database.get_instance()
     try:
         db.connect()
-        db.begin()
         db.to_jst()
+        db.begin()
         logger.debug('DCF施設統合マスタ日次更新処理開始')
         # DCF施設統合マスタ移行先コードのセット(無効フラグが『0(有効)』)
         enabled_dst_inst_merge_records = _set_enabled_dct_inst_merge(db)
@@ -121,13 +120,12 @@ def _add_ult_ident_presc(db: Database, enabled_dst_inst_merge_records: list[dict
     logger.info('納入先処方元マスタの登録 終了')


-def _select_primary_key_from_emp_chg_inst(db: Database, dcf_inst_cd: str) -> list[dict]:
-    # 従業員担当施設マスタから、DCF施設コードに対応した領域コードと担当者種別コードの取得
+def _select_emp_chg_inst_ta_cd(db: Database, dcf_inst_cd: str) -> list[dict]:
+    # 従業員担当施設マスタから、DCF施設コードに対応した領域コードの取得
     try:
         sql = """
             SELECT
-                ta_cd,
-                emp_chg_type_cd
+                ta_cd
             FROM
                 src05.emp_chg_inst
             WHERE
@@ -136,14 +134,14 @@ def _select_primary_key_from_emp_chg_inst(db: Database, dcf_inst_cd: str) -> lis
                 AND (SELECT ht.syor_date FROM src05.hdke_tbl AS ht) < end_date
         """
         params = {'dcf_inst_cd': dcf_inst_cd}
-        emp_chg_inst_primary_key_records = db.execute_select(sql, params)
+        emp_chg_inst_ta_cd_records = db.execute_select(sql, params)
         logging_sql(logger, sql)
-        logger.info('従業員担当施設マスタから領域コード、担当者種別コードの取得に成功')
+        logger.info('従業員担当施設マスタから領域コードの取得に成功')
     except Exception as e:
-        logger.debug('従業員担当施設マスタから領域コード、担当者種別コードの取得に失敗')
+        logger.debug('従業員担当施設マスタから領域コードの取得に失敗')
         raise e
-    return emp_chg_inst_primary_key_records
+    return emp_chg_inst_ta_cd_records
@@ -151,10 +149,10 @@ def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]):
     logger.info('従業員担当施設マスタの登録 開始')
     for enabled_merge_record in enabled_dst_inst_merge_records:
         tekiyo_month_first_day = _get_first_day_of_month(enabled_merge_record['tekiyo_month'])
-        emp_chg_inst_primary_key_records = _select_primary_key_from_emp_chg_inst(db, enabled_merge_record['dcf_inst_cd'])
-        for emp_chg_inst_primary_key_record in emp_chg_inst_primary_key_records:
+        emp_chg_inst_ta_cd_records = _select_emp_chg_inst_ta_cd(db, enabled_merge_record['dcf_inst_cd'])
+        for emp_chg_inst_ta_cd_record in emp_chg_inst_ta_cd_records:
             emp_chg_inst_records = _select_emp_chg_inst(db, enabled_merge_record['dcf_inst_cd'], enabled_merge_record['dup_opp_cd'],
-                                                        emp_chg_inst_primary_key_record['ta_cd'], emp_chg_inst_primary_key_record['emp_chg_type_cd'])
+                                                        emp_chg_inst_ta_cd_record['ta_cd'])
             for emp_chg_inst_row in emp_chg_inst_records:
                 # 重複時相手先コードが存在したかのチェック
                 if emp_chg_inst_row['opp_count'] > 0:
@@ -175,7 +173,7 @@ def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]):
                                                  emp_chg_inst_row)
                     continue
                 # 適用開始日 ≧ DCF施設統合マスタの適用月度の1日の場合、N(論理削除レコード)に設定する
-                _update_emp_chg_inst_disabled(db, enabled_merge_record['dcf_inst_cd'], emp_chg_inst_row['ta_cd'], emp_chg_inst_row['emp_chg_type_cd'],
+                _update_emp_chg_inst_disabled(db, enabled_merge_record['dcf_inst_cd'], emp_chg_inst_row['ta_cd'],
                                               emp_chg_inst_row['start_date'])
     logger.info('従業員担当施設マスタの登録 終了')
@@ -209,7 +207,7 @@ def _delete_ult_ident_presc(db: Database, start_date: str, ult_ident_presc_row:
         raise e


-def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, emp_chg_type_cd: str, start_date: str):
+def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, start_date: str):
     # emp_chg_instをUPDATE
     try:
         elapsed_time = ElapsedTime()
@@ -223,10 +221,9 @@ def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, em
             WHERE
                 inst_cd = :dcf_inst_cd
                 AND ta_cd = :ta_cd
-                AND emp_chg_type_cd = :emp_chg_type_cd
                 AND start_date = :start_date
         """
-        params = {'dcf_inst_cd': dcf_inst_cd, 'ta_cd': ta_cd, 'emp_chg_type_cd': emp_chg_type_cd, 'start_date': start_date}
+        params = {'dcf_inst_cd': dcf_inst_cd, 'ta_cd': ta_cd, 'start_date': start_date}
         res = db.execute(sql, params)
         logging_sql(logger, sql)
         logger.info(f'従業員担当施設マスタのYorNフラグ更新に成功, {res.rowcount} 行更新 ({elapsed_time.of})')
@@ -249,7 +246,6 @@ def _update_emp_chg_inst_end_date(db: Database, dcf_inst_cd: str, last_end_date:
             WHERE
                 inst_cd = :dcf_inst_cd
                 AND ta_cd = :ta_cd
-                AND emp_chg_type_cd = :emp_chg_type_cd
                 AND emp_cd = :emp_cd
                 AND bu_cd = :bu_cd
                 AND start_date = :start_date
@@ -258,7 +254,6 @@ def _update_emp_chg_inst_end_date(db: Database, dcf_inst_cd: str, last_end_date:
             'end_date': last_end_date,
             'dcf_inst_cd': dcf_inst_cd,
             'ta_cd': emp_chg_inst_row['ta_cd'],
-            'emp_chg_type_cd': emp_chg_inst_row['emp_chg_type_cd'],
             'emp_cd': emp_chg_inst_row['emp_cd'],
             'bu_cd': emp_chg_inst_row['bu_cd'],
             'start_date': emp_chg_inst_row['start_date']
@@ -281,7 +276,6 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
             src05.emp_chg_inst(
                 inst_cd,
                 ta_cd,
-                emp_chg_type_cd,
                 emp_cd,
                 bu_cd,
                 start_date,
@@ -296,7 +290,6 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
             VALUES(
                 :dup_opp_cd,
                 :ta_cd,
-                :emp_chg_type_cd,
                 :emp_cd,
                 :bu_cd,
                 :start_date,
@@ -312,7 +305,6 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
         params = {
             'dup_opp_cd': dup_opp_cd,
             'ta_cd': emp_chg_inst_row['ta_cd'],
-            'emp_chg_type_cd': emp_chg_inst_row['emp_chg_type_cd'],
             'emp_cd': emp_chg_inst_row['emp_cd'],
             'bu_cd': emp_chg_inst_row['bu_cd'],
             'start_date': set_start_date,
@@ -526,14 +518,13 @@ def _insert_ult_ident_presc(db: Database, set_Start_Date: str, dup_opp_cd: str,
         raise e


-def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd: str, emp_chg_type_cd: str) -> list[dict]:
+def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd: str) -> list[dict]:
     # emp_chg_instからSELECT
     try:
         sql = """
             SELECT
                 eci.inst_cd,
                 eci.ta_cd,
-                eci.emp_chg_type_cd,
                 eci.emp_cd,
                 eci.bu_cd,
                 eci.start_date,
@@ -548,18 +539,16 @@ def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd:
                 WHERE
                     eciopp.inst_cd = :dup_opp_cd
                     AND eciopp.ta_cd = :ta_cd
-                    AND eciopp.emp_chg_type_cd = :emp_chg_type_cd
                 ) AS opp_count
             FROM
                 src05.emp_chg_inst AS eci
             WHERE
                 eci.inst_cd = :dcf_inst_cd
                 AND eci.ta_cd = :ta_cd
-                AND eci.emp_chg_type_cd = :emp_chg_type_cd
                 AND eci.enabled_flg = 'Y'
                 AND (SELECT ht.syor_date FROM src05.hdke_tbl AS ht) < eci.end_date
         """
-        params = {'dcf_inst_cd': dcf_inst_cd, 'dup_opp_cd': dup_opp_cd, 'ta_cd': ta_cd, 'emp_chg_type_cd': emp_chg_type_cd}
+        params = {'dcf_inst_cd': dcf_inst_cd, 'dup_opp_cd': dup_opp_cd, 'ta_cd': ta_cd}
         emp_chg_inst_records = db.execute_select(sql, params)
         logging_sql(logger, sql)
         logger.info('従業員担当施設マスタの取得 成功')

View File

@@ -53,9 +53,7 @@ def _insert_into_emp_chg_inst_lau_from_emp_chg_inst(db: Database):
 src05.emp_chg_inst_lau
 SELECT
 inst_cd,
-ta_cd,
-emp_chg_type_cd,
-emp_cd,
+ta_cd,emp_cd,
 bu_cd,
 start_date,
 end_date,

View File

@@ -86,14 +86,7 @@ def _insert_mst_inst_from_fcl_mst_v(db: Database):
 END AS address,
 fmv1.postal_cd,
 fmv1.tel_num,
-CASE
-WHEN
-fmv1.fcl_type BETWEEN '20' AND '29' THEN LEFT(fmv1.closed_dt, 10)
-WHEN
-fmv1.fcl_type IN ('A1', 'A0') AND fmv1.end_date != '9999-12-31' THEN DATE_FORMAT(fmv1.end_date, "%Y-%m-%d")
-ELSE
-null
-END AS delete_date,
+LEFT(fmv1.closed_dt, 10),
 fmv1.v_inst_cd,
 fmv1.ins_dt,
 fmv1.upd_dt

View File

@@ -1,9 +1,8 @@
 from src.batch.common.batch_context import BatchContext
-from src.batch.laundering import (
-create_inst_merge_for_laundering, emp_chg_inst_laundering,
-ult_ident_presc_laundering, sales_results_laundering)
 from src.batch.dcf_inst_merge import integrate_dcf_inst_merge
+from src.batch.laundering import (create_inst_merge_for_laundering,
+emp_chg_inst_laundering,
+sales_results_laundering,
+ult_ident_presc_laundering)
 from src.logging.get_logger import get_logger
 batch_context = BatchContext.get_instance()

View File

@@ -16,115 +16,166 @@
] ]
}, },
"default": { "default": {
"greenlet": {
"hashes": [
"sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67",
"sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6",
"sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257",
"sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4",
"sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676",
"sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61",
"sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc",
"sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca",
"sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7",
"sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728",
"sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305",
"sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6",
"sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379",
"sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414",
"sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04",
"sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a",
"sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf",
"sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491",
"sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559",
"sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e",
"sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274",
"sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb",
"sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b",
"sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9",
"sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b",
"sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be",
"sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506",
"sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405",
"sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113",
"sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f",
"sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5",
"sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230",
"sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d",
"sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f",
"sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a",
"sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e",
"sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61",
"sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6",
"sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d",
"sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71",
"sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22",
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2",
"sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3",
"sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067",
"sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc",
"sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881",
"sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3",
"sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e",
"sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac",
"sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53",
"sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0",
"sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b",
"sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83",
"sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41",
"sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c",
"sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf",
"sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da",
"sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"
],
"markers": "platform_machine == 'aarch64' or (platform_machine == 'ppc64le' or (platform_machine == 'x86_64' or (platform_machine == 'amd64' or (platform_machine == 'AMD64' or (platform_machine == 'win32' or platform_machine == 'WIN32')))))",
"version": "==3.0.3"
},
"pymysql": { "pymysql": {
"hashes": [ "hashes": [
"sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c", "sha256:4f13a7df8bf36a51e81dd9f3605fede45a4878fe02f9236349fd82a3f0612f96",
"sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0" "sha256:8969ec6d763c856f7073c4c64662882675702efcb114b4bcbb955aea3a069fa7"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.7'", "version": "==1.1.0"
"version": "==1.1.1"
}, },
"sqlalchemy": { "sqlalchemy": {
"hashes": [ "hashes": [
"sha256:023b3ee6169969beea3bb72312e44d8b7c27c75b347942d943cf49397b7edeb5", "sha256:0d3cab3076af2e4aa5693f89622bef7fa770c6fec967143e4da7508b3dceb9b9",
"sha256:03968a349db483936c249f4d9cd14ff2c296adfa1290b660ba6516f973139582", "sha256:0dacf67aee53b16f365c589ce72e766efaabd2b145f9de7c917777b575e3659d",
"sha256:05132c906066142103b83d9c250b60508af556982a385d96c4eaa9fb9720ac2b", "sha256:10331f129982a19df4284ceac6fe87353ca3ca6b4ca77ff7d697209ae0a5915e",
"sha256:087b6b52de812741c27231b5a3586384d60c353fbd0e2f81405a814b5591dc8b", "sha256:14a6f68e8fc96e5e8f5647ef6cda6250c780612a573d99e4d881581432ef1669",
"sha256:0b3dbf1e7e9bc95f4bac5e2fb6d3fb2f083254c3fdd20a1789af965caf2d2348", "sha256:1b1180cda6df7af84fe72e4530f192231b1f29a7496951db4ff38dac1687202d",
"sha256:118c16cd3f1b00c76d69343e38602006c9cfb9998fa4f798606d28d63f23beda", "sha256:29049e2c299b5ace92cbed0c1610a7a236f3baf4c6b66eb9547c01179f638ec5",
"sha256:1936af879e3db023601196a1684d28e12f19ccf93af01bf3280a3262c4b6b4e5", "sha256:342d365988ba88ada8af320d43df4e0b13a694dbd75951f537b2d5e4cb5cd002",
"sha256:1e3f196a0c59b0cae9a0cd332eb1a4bda4696e863f4f1cf84ab0347992c548c2", "sha256:420362338681eec03f53467804541a854617faed7272fe71a1bfdb07336a381e",
"sha256:23a8825495d8b195c4aa9ff1c430c28f2c821e8c5e2d98089228af887e5d7e29", "sha256:4344d059265cc8b1b1be351bfb88749294b87a8b2bbe21dfbe066c4199541ebd",
"sha256:293cd444d82b18da48c9f71cd7005844dbbd06ca19be1ccf6779154439eec0b8", "sha256:4f7a7d7fcc675d3d85fbf3b3828ecd5990b8d61bd6de3f1b260080b3beccf215",
"sha256:32f9dc8c44acdee06c8fc6440db9eae8b4af8b01e4b1aee7bdd7241c22edff4f", "sha256:555651adbb503ac7f4cb35834c5e4ae0819aab2cd24857a123370764dc7d7e24",
"sha256:34ea30ab3ec98355235972dadc497bb659cc75f8292b760394824fab9cf39826", "sha256:59a21853f5daeb50412d459cfb13cb82c089ad4c04ec208cd14dddd99fc23b39",
"sha256:3d3549fc3e40667ec7199033a4e40a2f669898a00a7b18a931d3efb4c7900504", "sha256:5fdd402169aa00df3142149940b3bf9ce7dde075928c1886d9a1df63d4b8de62",
"sha256:41836fe661cc98abfae476e14ba1906220f92c4e528771a8a3ae6a151242d2ae", "sha256:605b6b059f4b57b277f75ace81cc5bc6335efcbcc4ccb9066695e515dbdb3900",
"sha256:4d44522480e0bf34c3d63167b8cfa7289c1c54264c2950cc5fc26e7850967e45", "sha256:665f0a3954635b5b777a55111ababf44b4fc12b1f3ba0a435b602b6387ffd7cf",
"sha256:4eeb195cdedaf17aab6b247894ff2734dcead6c08f748e617bfe05bd5a218443", "sha256:6f9e2e59cbcc6ba1488404aad43de005d05ca56e069477b33ff74e91b6319735",
"sha256:4f67766965996e63bb46cfbf2ce5355fc32d9dd3b8ad7e536a920ff9ee422e23", "sha256:736ea78cd06de6c21ecba7416499e7236a22374561493b456a1f7ffbe3f6cdb4",
"sha256:57df5dc6fdb5ed1a88a1ed2195fd31927e705cad62dedd86b46972752a80f576", "sha256:74b080c897563f81062b74e44f5a72fa44c2b373741a9ade701d5f789a10ba23",
"sha256:598d9ebc1e796431bbd068e41e4de4dc34312b7aa3292571bb3674a0cb415dd1", "sha256:75432b5b14dc2fff43c50435e248b45c7cdadef73388e5610852b95280ffd0e9",
"sha256:5b14e97886199c1f52c14629c11d90c11fbb09e9334fa7bb5f6d068d9ced0ce0", "sha256:75f99202324383d613ddd1f7455ac908dca9c2dd729ec8584c9541dd41822a2c",
"sha256:5e22575d169529ac3e0a120cf050ec9daa94b6a9597993d1702884f6954a7d71", "sha256:790f533fa5c8901a62b6fef5811d48980adeb2f51f1290ade8b5e7ba990ba3de",
"sha256:60c578c45c949f909a4026b7807044e7e564adf793537fc762b2489d522f3d11", "sha256:798f717ae7c806d67145f6ae94dc7c342d3222d3b9a311a784f371a4333212c7",
"sha256:6145afea51ff0af7f2564a05fa95eb46f542919e6523729663a5d285ecb3cf5e", "sha256:7c88f0c7dcc5f99bdb34b4fd9b69b93c89f893f454f40219fe923a3a2fd11625",
"sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f", "sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f",
"sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8", "sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a",
"sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd", "sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4",
"sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814", "sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643",
"sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08", "sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018",
"sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea", "sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9",
"sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30", "sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5",
"sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda", "sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08",
"sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9", "sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3",
"sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923", "sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3",
"sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df", "sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed",
"sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036", "sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95",
"sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3", "sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6",
"sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f", "sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0",
"sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6", "sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84",
"sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04", "sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f",
"sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2", "sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d",
"sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560", "sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2",
"sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70", "sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5",
"sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769", "sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570",
"sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1", "sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7",
"sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6", "sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c",
"sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b", "sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf",
"sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747", "sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3",
"sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078", "sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed",
"sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440", "sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5"
"sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
"sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
"sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
"sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
"sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
"sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
"sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
"sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.7'", "version": "==2.0.25"
"version": "==2.0.41"
}, },
"tenacity": { "tenacity": {
"hashes": [ "hashes": [
"sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb", "sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a",
"sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138" "sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.9'", "version": "==8.2.3"
"version": "==9.1.2"
}, },
"typing-extensions": { "typing-extensions": {
"hashes": [ "hashes": [
"sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4", "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783",
"sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af" "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"
], ],
"markers": "python_version >= '3.9'", "markers": "python_version >= '3.8'",
"version": "==4.14.0" "version": "==4.9.0"
} }
}, },
"develop": { "develop": {
"autopep8": { "autopep8": {
"hashes": [ "hashes": [
"sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda", "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
"sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d" "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.8'", "version": "==2.0.4"
"version": "==2.3.1"
}, },
"flake8": { "flake8": {
"hashes": [ "hashes": [
"sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38", "sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132",
"sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213" "sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_full_version >= '3.8.1'", "version": "==7.0.0"
"version": "==7.1.1"
}, },
"mccabe": { "mccabe": {
"hashes": [ "hashes": [
@@ -136,11 +187,11 @@
}, },
"pycodestyle": { "pycodestyle": {
"hashes": [ "hashes": [
"sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3", "sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f",
"sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521" "sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==2.12.1" "version": "==2.11.1"
}, },
"pyflakes": { "pyflakes": {
"hashes": [ "hashes": [
@@ -152,11 +203,11 @@
}, },
"tomli": { "tomli": {
"hashes": [ "hashes": [
"sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38", "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc",
"sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed" "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"
], ],
"markers": "python_version < '3.11'", "markers": "python_version < '3.11'",
"version": "==2.0.2" "version": "==2.0.1"
} }
} }
} }

View File

@@ -1,4 +1,4 @@
-FROM python:3.9
+FROM python:3.9-bullseye
 ENV TZ="Asia/Tokyo"

View File

@@ -16,115 +16,166 @@
] ]
}, },
"default": { "default": {
"greenlet": {
"hashes": [
"sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67",
"sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6",
"sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257",
"sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4",
"sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676",
"sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61",
"sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc",
"sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca",
"sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7",
"sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728",
"sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305",
"sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6",
"sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379",
"sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414",
"sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04",
"sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a",
"sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf",
"sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491",
"sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559",
"sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e",
"sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274",
"sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb",
"sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b",
"sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9",
"sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b",
"sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be",
"sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506",
"sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405",
"sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113",
"sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f",
"sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5",
"sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230",
"sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d",
"sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f",
"sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a",
"sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e",
"sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61",
"sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6",
"sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d",
"sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71",
"sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22",
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2",
"sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3",
"sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067",
"sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc",
"sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881",
"sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3",
"sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e",
"sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac",
"sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53",
"sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0",
"sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b",
"sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83",
"sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41",
"sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c",
"sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf",
"sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da",
"sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"
],
"markers": "platform_machine == 'aarch64' or (platform_machine == 'ppc64le' or (platform_machine == 'x86_64' or (platform_machine == 'amd64' or (platform_machine == 'AMD64' or (platform_machine == 'win32' or platform_machine == 'WIN32')))))",
"version": "==3.0.3"
},
"pymysql": { "pymysql": {
"hashes": [ "hashes": [
"sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c", "sha256:4f13a7df8bf36a51e81dd9f3605fede45a4878fe02f9236349fd82a3f0612f96",
"sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0" "sha256:8969ec6d763c856f7073c4c64662882675702efcb114b4bcbb955aea3a069fa7"
], ],
"index": "pypi", "index": "pypi",
"markers": "python_version >= '3.7'", "version": "==1.1.0"
"version": "==1.1.1"
}, },
"sqlalchemy": { "sqlalchemy": {
"hashes": [ "hashes": [
"sha256:023b3ee6169969beea3bb72312e44d8b7c27c75b347942d943cf49397b7edeb5", "sha256:0d3cab3076af2e4aa5693f89622bef7fa770c6fec967143e4da7508b3dceb9b9",
"sha256:03968a349db483936c249f4d9cd14ff2c296adfa1290b660ba6516f973139582", "sha256:0dacf67aee53b16f365c589ce72e766efaabd2b145f9de7c917777b575e3659d",
"sha256:05132c906066142103b83d9c250b60508af556982a385d96c4eaa9fb9720ac2b", "sha256:10331f129982a19df4284ceac6fe87353ca3ca6b4ca77ff7d697209ae0a5915e",
"sha256:087b6b52de812741c27231b5a3586384d60c353fbd0e2f81405a814b5591dc8b", "sha256:14a6f68e8fc96e5e8f5647ef6cda6250c780612a573d99e4d881581432ef1669",
"sha256:0b3dbf1e7e9bc95f4bac5e2fb6d3fb2f083254c3fdd20a1789af965caf2d2348", "sha256:1b1180cda6df7af84fe72e4530f192231b1f29a7496951db4ff38dac1687202d",
"sha256:118c16cd3f1b00c76d69343e38602006c9cfb9998fa4f798606d28d63f23beda", "sha256:29049e2c299b5ace92cbed0c1610a7a236f3baf4c6b66eb9547c01179f638ec5",
"sha256:1936af879e3db023601196a1684d28e12f19ccf93af01bf3280a3262c4b6b4e5", "sha256:342d365988ba88ada8af320d43df4e0b13a694dbd75951f537b2d5e4cb5cd002",
"sha256:1e3f196a0c59b0cae9a0cd332eb1a4bda4696e863f4f1cf84ab0347992c548c2", "sha256:420362338681eec03f53467804541a854617faed7272fe71a1bfdb07336a381e",
"sha256:23a8825495d8b195c4aa9ff1c430c28f2c821e8c5e2d98089228af887e5d7e29", "sha256:4344d059265cc8b1b1be351bfb88749294b87a8b2bbe21dfbe066c4199541ebd",
"sha256:293cd444d82b18da48c9f71cd7005844dbbd06ca19be1ccf6779154439eec0b8", "sha256:4f7a7d7fcc675d3d85fbf3b3828ecd5990b8d61bd6de3f1b260080b3beccf215",
"sha256:32f9dc8c44acdee06c8fc6440db9eae8b4af8b01e4b1aee7bdd7241c22edff4f", "sha256:555651adbb503ac7f4cb35834c5e4ae0819aab2cd24857a123370764dc7d7e24",
"sha256:34ea30ab3ec98355235972dadc497bb659cc75f8292b760394824fab9cf39826", "sha256:59a21853f5daeb50412d459cfb13cb82c089ad4c04ec208cd14dddd99fc23b39",
"sha256:3d3549fc3e40667ec7199033a4e40a2f669898a00a7b18a931d3efb4c7900504", "sha256:5fdd402169aa00df3142149940b3bf9ce7dde075928c1886d9a1df63d4b8de62",
"sha256:41836fe661cc98abfae476e14ba1906220f92c4e528771a8a3ae6a151242d2ae", "sha256:605b6b059f4b57b277f75ace81cc5bc6335efcbcc4ccb9066695e515dbdb3900",
"sha256:4d44522480e0bf34c3d63167b8cfa7289c1c54264c2950cc5fc26e7850967e45", "sha256:665f0a3954635b5b777a55111ababf44b4fc12b1f3ba0a435b602b6387ffd7cf",
"sha256:4eeb195cdedaf17aab6b247894ff2734dcead6c08f748e617bfe05bd5a218443", "sha256:6f9e2e59cbcc6ba1488404aad43de005d05ca56e069477b33ff74e91b6319735",
"sha256:4f67766965996e63bb46cfbf2ce5355fc32d9dd3b8ad7e536a920ff9ee422e23", "sha256:736ea78cd06de6c21ecba7416499e7236a22374561493b456a1f7ffbe3f6cdb4",
"sha256:57df5dc6fdb5ed1a88a1ed2195fd31927e705cad62dedd86b46972752a80f576", "sha256:74b080c897563f81062b74e44f5a72fa44c2b373741a9ade701d5f789a10ba23",
"sha256:598d9ebc1e796431bbd068e41e4de4dc34312b7aa3292571bb3674a0cb415dd1", "sha256:75432b5b14dc2fff43c50435e248b45c7cdadef73388e5610852b95280ffd0e9",
"sha256:5b14e97886199c1f52c14629c11d90c11fbb09e9334fa7bb5f6d068d9ced0ce0", "sha256:75f99202324383d613ddd1f7455ac908dca9c2dd729ec8584c9541dd41822a2c",
"sha256:5e22575d169529ac3e0a120cf050ec9daa94b6a9597993d1702884f6954a7d71", "sha256:790f533fa5c8901a62b6fef5811d48980adeb2f51f1290ade8b5e7ba990ba3de",
"sha256:60c578c45c949f909a4026b7807044e7e564adf793537fc762b2489d522f3d11", "sha256:798f717ae7c806d67145f6ae94dc7c342d3222d3b9a311a784f371a4333212c7",
"sha256:6145afea51ff0af7f2564a05fa95eb46f542919e6523729663a5d285ecb3cf5e", "sha256:7c88f0c7dcc5f99bdb34b4fd9b69b93c89f893f454f40219fe923a3a2fd11625",
"sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f", "sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f",
"sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8", "sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a",
"sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd", "sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4",
"sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814", "sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643",
"sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08", "sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018",
"sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea", "sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9",
"sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30", "sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5",
"sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda", "sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08",
"sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9", "sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3",
"sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923", "sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3",
-                "sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df",
-                "sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036",
-                "sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3",
-                "sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f",
-                "sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6",
-                "sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04",
-                "sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2",
-                "sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560",
-                "sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70",
-                "sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769",
-                "sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1",
-                "sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6",
-                "sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b",
-                "sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747",
-                "sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078",
-                "sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440",
-                "sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
-                "sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
-                "sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
-                "sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
-                "sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
-                "sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
-                "sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
-                "sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
+                "sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed",
+                "sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95",
+                "sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6",
+                "sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0",
+                "sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84",
+                "sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f",
+                "sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d",
+                "sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2",
+                "sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5",
+                "sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570",
+                "sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7",
+                "sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c",
+                "sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf",
+                "sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3",
+                "sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed",
+                "sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.7'",
-            "version": "==2.0.41"
+            "version": "==2.0.25"
         },
         "tenacity": {
             "hashes": [
-                "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb",
-                "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138"
+                "sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a",
+                "sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.9'",
-            "version": "==9.1.2"
+            "version": "==8.2.3"
         },
         "typing-extensions": {
             "hashes": [
-                "sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4",
-                "sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af"
+                "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783",
+                "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"
             ],
-            "markers": "python_version >= '3.9'",
-            "version": "==4.14.0"
+            "markers": "python_version >= '3.8'",
+            "version": "==4.9.0"
         }
     },
     "develop": {
         "autopep8": {
             "hashes": [
-                "sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda",
-                "sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d"
+                "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
+                "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
             ],
             "index": "pypi",
-            "markers": "python_version >= '3.8'",
-            "version": "==2.3.1"
+            "version": "==2.0.4"
         },
         "flake8": {
             "hashes": [
-                "sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38",
-                "sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213"
+                "sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132",
+                "sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3"
             ],
             "index": "pypi",
-            "markers": "python_full_version >= '3.8.1'",
-            "version": "==7.1.1"
+            "version": "==7.0.0"
         },
         "mccabe": {
             "hashes": [
@@ -136,11 +187,11 @@
         },
         "pycodestyle": {
             "hashes": [
-                "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
-                "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
+                "sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f",
+                "sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==2.12.1"
+            "version": "==2.11.1"
         },
         "pyflakes": {
             "hashes": [
@@ -152,11 +203,11 @@
         },
         "tomli": {
             "hashes": [
-                "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38",
-                "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed"
+                "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc",
+                "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"
             ],
             "markers": "python_version < '3.11'",
-            "version": "==2.0.2"
+            "version": "==2.0.1"
         }
     }
 }

@@ -1,4 +1,4 @@
-FROM python:3.9
+FROM python:3.9-bullseye

 ENV TZ="Asia/Tokyo"

@@ -19,11 +19,11 @@
     "develop": {
         "autopep8": {
             "hashes": [
-                "sha256:1fa8964e4618929488f4ec36795c7ff12924a68b8bf01366c094fc52f770b6e7",
-                "sha256:2bb76888c5edbcafe6aabab3c47ba534f5a2c2d245c2eddced4a30c4b4946357"
+                "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
+                "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
             ],
             "index": "pypi",
-            "version": "==2.1.0"
+            "version": "==2.0.4"
         },
         "flake8": {
             "hashes": [

@@ -26,7 +26,6 @@ openpyxl = "*"
 xlrd = "*"
 sqlalchemy = "==2.*"
 mojimoji = "*"
-numpy = "==2.0.*"

 [dev-packages]
 autopep8 = "*"

File diff suppressed because it is too large

@@ -84,8 +84,6 @@
 │   ├── exception_handler.py -- error handling when an error occurs inside FastAPI
 │   └── exceptions.py -- custom exception classes
 ├── main.py -- entry point of the AP server; routers and handlers are registered here
-├── middleware -- middleware settings
-│   └── middleware.py
 ├── model -- model layer (the M of MVC)
 │   ├── db -- models for the DB records returned by repositories
 │   │   ├── base_db_model.py
@@ -197,39 +195,3 @@
 - When a controller's router variable is set to `router.route_class = Authenticate`, it behaves as follows
   - when a request arrives, it checks whether a session exists
   - on the response, it registers the session key in a cookie
-
-## How to generate and set the SRI hash values of scripts loaded from HTML
-
-### What is Subresource Integrity (SRI)?
-
-A security feature by which the browser verifies that a resource fetched from a CDN or similar has not been unintentionally tampered with. When using SRI, you specify the hash value that the fetched resource's hash must match.
-Details: <https://developer.mozilla.org/ja/docs/Web/Security/Subresource_Integrity>
-
-The 実消化&アルトマーク web application loads several external scripts, so whenever a loaded script is changed,
-the script hash value set in the tag's `integrity` attribute must be updated.
-
-### How to generate an SRI hash (scripts stored on the server)
-
-- When a script stored on the server is updated, generate the hash by running the following command in a Linux environment (WSL2 is also fine)
-
-```bash
-cat <updated script file name> | openssl dgst -sha384 -binary | openssl base64 -A
-```
-
-Reference: <https://developer.mozilla.org/ja/docs/Web/Security/Subresource_Integrity#sri_%E3%83%8F%E3%83%83%E3%82%B7%E3%83%A5%E3%82%92%E7%94%9F%E6%88%90%E3%81%99%E3%82%8B%E3%83%84%E3%83%BC%E3%83%AB>
-
-### How to generate an SRI hash (scripts loaded from external sites)
-
-- When a script loaded from an external site is updated, generate the hash with the MDN online tool below
-  - [SRI Hash Generator](https://www.srihash.org/)
-
-### How to set an SRI hash
-
-- Replace the `integrity` attribute value where the updated script is loaded with the generated hash
-- A sample setting:
-
-```html
-<script src="https://<link>/<script>.js" integrity="sha384-<generated hash>" crossorigin="anonymous"></script>
-```

@@ -79,11 +79,11 @@ def search_bio_data(
         'data': data,
         'count': bio_sales_lot_count
     })

     # rewrite the cookie as well
     json_response.set_cookie(
         key='session',
         value=session.session_key,
-        max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # the cookie lifetime is specified in seconds, so multiply by 60
         secure=True,
         httponly=True
     )
@@ -153,10 +153,10 @@ async def download_bio_data(
         'status': 'ok',
         'download_url': download_file_url
     })
     json_response.set_cookie(
         key='session',
         value=session.session_key,
-        max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # the cookie lifetime is specified in seconds, so multiply by 60
         secure=True,
         httponly=True
     )

@@ -70,22 +70,11 @@ def login(
         jwt_token = login_service.login(request.username, request.password)
     except NotAuthorizeException as e:
         logger.info(f'ログイン失敗:{e}')
-        # count the number of failed logins
-        login_service.increase_login_failed_count(request.username)
-        # change the message if the failed-login limit has been exceeded
-        if login_service.is_login_failed_limit_exceeded(request.username):
-            login_service.on_login_fail_limit_exceeded(request.username)
-            raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED)
         raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_ERROR)
     except JWTTokenVerifyException as e:
         logger.info(f'ログイン失敗:{e}')
         raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)

-    # regardless of whether login succeeded, redirect to the logout screen if the failed-login count in the DB is 10 or more
-    if login_service.is_login_failed_limit_exceeded(request.username):
-        logger.info(f'ログイン失敗回数が10回以上: {request.username}')
-        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED)
     verified_token = jwt_token.verify_token()
     # with normal authentication, the value is stored in `cognito:username`
     user_id = verified_token.user_id
@@ -124,7 +113,6 @@ def login(
         status_code=status.HTTP_303_SEE_OTHER,
         headers={'session_key': session_key}
     )
-
     return response
@@ -182,5 +170,4 @@ def sso_authorize(
         status_code=status.HTTP_303_SEE_OTHER,
         headers={'session_key': session_key}
     )
-
     return response

@@ -8,7 +8,6 @@ from src.model.internal.session import UserSession
 from src.model.view.logout_view_model import LogoutViewModel
 from src.system_var import constants
 from src.templates import templates
-from src.services import session_service

 router = APIRouter()
@@ -17,7 +16,6 @@ router = APIRouter()
 #########################
 @router.get('/', response_class=HTMLResponse)
 def logout_view(
     request: Request,
@@ -49,9 +47,4 @@ def logout_view(
     )
     # delete the cookie
     template_response.delete_cookie('session')
-    # delete the session
-    if session:
-        session_service.delete_session(session)
     return template_response

@@ -270,7 +270,6 @@ def inst_emp_csv_download(
         ta_cd=csv_download_form.ta_cd,
         inst_cd=csv_download_form.inst_cd,
         emp_cd=csv_download_form.emp_cd,
-        emp_chg_type_cd=csv_download_form.emp_chg_type_cd,
         apply_date_from=csv_download_form.apply_date_from,
         start_date_from=csv_download_form.start_date_from,
         start_date_to=csv_download_form.start_date_to,

@@ -189,7 +189,7 @@ class DatabaseClient:
         self.__session = None

     def to_jst(self):
-        self.execute('SET SESSION time_zone = "Asia/Tokyo"')
+        self.execute('SET time_zone = "+9:00"')

     def __execute_with_transaction(self, query: str, parameters: dict):
         # start a transaction and execute the query
@@ -10,9 +10,8 @@ from src.controller import (bio, bio_api, healthcheck, login, logout,
 from src.core import task
 from src.error.exception_handler import http_exception_handler
 from src.error.exceptions import UnexpectedException
-from src.middleware.middleware import SecurityHeadersMiddleware

-app = FastAPI(openapi_url=None)
+app = FastAPI()

 # mount static files
 app.mount('/static', StaticFiles(directory=path.dirname(static.__file__)), name='static')
@@ -43,8 +42,5 @@ app.add_exception_handler(status.HTTP_403_FORBIDDEN, http_exception_handler)
 # handler for server errors; these cannot be handled via HTTPException, so it is registered separately
 app.add_exception_handler(UnexpectedException, http_exception_handler)

-# security header settings are handled in middleware
-app.add_middleware(SecurityHeadersMiddleware)
-
 # event on server startup
 app.add_event_handler('startup', task.create_start_app_handler())

@@ -1,16 +0,0 @@
-from fastapi import Request, Response, status
-from fastapi.responses import JSONResponse
-from starlette.middleware.base import BaseHTTPMiddleware
-
-class SecurityHeadersMiddleware(BaseHTTPMiddleware):
-    async def dispatch(self, request, call_next):
-        response = await call_next(request)
-        # add the X-Frame-Options header
-        response.headers['X-Frame-Options'] = 'DENY'
-        # add the X-Content-Type-Options header
-        response.headers['X-Content-Type-Options'] = 'nosniff'
-        # add the Strict-Transport-Security header
-        response.headers['Strict-Transport-Security'] = 'max-age=31536000 includeSubDomains'
-        # add the Cache-Control header
-        response.headers['Cache-Control'] = 'private'
-        return response

@@ -2,7 +2,7 @@ from datetime import datetime
 from typing import Optional

 from src.model.db.base_db_model import BaseDBModel
-from src.system_var import constants

 class UserMasterModel(BaseDBModel):
     user_id: Optional[str]
@@ -25,8 +25,6 @@ class UserMasterModel(BaseDBModel):
     updater: Optional[str]
     update_date: Optional[datetime]
     mntuser_flg: Optional[str]
-    mntuser_login_failed_cnt: Optional[int]
-    mntuser_last_login_failed_datetime: Optional[datetime]

     def is_enable_user(self):
         return self.enabled_flg == 'Y'
@@ -36,6 +34,3 @@ class UserMasterModel(BaseDBModel):
     def is_groupware_user(self):
         return self.mntuser_flg == '0' or self.mntuser_flg is None
-
-    def is_login_failed_limit_exceeded(self):
-        return self.mntuser_login_failed_cnt >= constants.LOGIN_FAIL_LIMIT

@@ -1,17 +1,16 @@
 import csv
 import json
-from abc import ABCMeta, abstractmethod
-from datetime import datetime
-from io import TextIOWrapper
+from io import TextIOWrapper
+from datetime import datetime
+from abc import ABCMeta, abstractmethod

-from src.logging.get_logger import get_logger
-from src.repositories.bu_master_cd_repository import BuMasterRepository
-from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
-from src.repositories.emp_master_repository import EmpMasterRepository
-from src.repositories.generic_kbn_mst_repository import GenericKbnMstRepository
-from src.repositories.mst_inst_repository import MstInstRepository
 from src.system_var import constants
 from src.util.string_util import is_not_empty
+from src.repositories.mst_inst_repository import MstInstRepository
+from src.repositories.bu_master_cd_repository import BuMasterRepository
+from src.repositories.emp_master_repository import EmpMasterRepository
+from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
+from src.logging.get_logger import get_logger

 logger = get_logger('マスターメンテ')
@@ -25,7 +24,6 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
     emp_master_repository: EmpMasterRepository
     bu_master_repository: BuMasterRepository
     emp_chginst_repository: EmpChgInstRepository
-    generic_kbn_mst_repository: GenericKbnMstRepository

     def __init__(
         self,
@@ -35,8 +33,7 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
         mst_inst_repository: MstInstRepository,
         emp_master_repository: EmpMasterRepository,
         bu_master_repository: BuMasterRepository,
-        emp_chginst_repository: EmpChgInstRepository,
-        generic_kbn_mst_repository: GenericKbnMstRepository
+        emp_chginst_repository: EmpChgInstRepository
     ):
         self.csv_row = csv_row
         self.table_name = table_name
@@ -45,7 +42,6 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
         self.emp_master_repository = emp_master_repository
         self.bu_master_repository = bu_master_repository
         self.emp_chginst_repository = emp_chginst_repository
-        self.generic_kbn_mst_repository = generic_kbn_mst_repository

     def validate(self) -> list[str]:
         """
@@ -61,10 +57,6 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
         error_list.extend(self.check_require())
         # institution code existence check
         error_list.extend(self.check_inst_cd_exists())
-        # area code existence check
-        error_list.extend(self.check_ta_cd_exists())
-        # staff-type code existence check
-        error_list.extend(self.check_emp_chg_type_cd_exists())
         # MUID existence check
         error_list.extend(self.check_emp_cd_exists())
         # BuCd existence check
@@ -87,7 +79,7 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
         return error_list

     def emp_chg_inst_count(self, start_date: str):
-        return self.emp_chginst_repository.fetch_count(self.inst_cd, self.ta_cd, self.emp_chg_type_cd, start_date, self.table_name)
+        return self.emp_chginst_repository.fetch_count(self.inst_cd, self.ta_cd, start_date, self.table_name)

     def is_exist_emp_cd(self, start_date: str) -> bool:
         if start_date is None or len(start_date) == 0:
@@ -99,36 +91,12 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
     def is_exist_inst_cd(self) -> bool:
         return True if self.mst_inst_repository.fetch_count(self.inst_cd) > 0 else False

-    def is_exist_emp_chg_type_cd(self, start_date: str) -> bool:
-        if start_date is None or len(start_date) == 0:
-            return False
-        if self.generic_kbn_mst_repository.fetch_count('emp_chg_type_cd', self.emp_chg_type_cd, start_date) > 0:
-            return True
-        return False
-
-    def is_exist_ta_cd(self, start_date: str) -> bool:
-        if start_date is None or len(start_date) == 0:
-            return False
-        if self.generic_kbn_mst_repository.fetch_count('ta_cd', self.ta_cd, start_date) > 0:
-            return True
-        return False
-
     def is_exist_bu_cd(self) -> bool:
         return True if self.bu_master_repository.fetch_count(self.bu_cd) > 0 else False

     def make_require_error_message(self, line_num: str, col_name: str) -> str:
         return f'{line_num}行目の{col_name}が入力されておりません。'

-    def make_data_exist_error_message(self, line_num: str, primary_key_col_names: list[str]) -> str:
-        return self.__make_check_data_exists_error_message(line_num, primary_key_col_names, 'がすべて同一のデータが既に登録されています。')
-
-    def make_data_not_exist_error_message(self, line_num: str, primary_key_col_names: list[str]) -> str:
-        return self.__make_check_data_exists_error_message(line_num, primary_key_col_names, 'がすべて同一のデータが存在しないため更新できません。')
-
-    def __make_check_data_exists_error_message(self, line_num: str, primary_key_col_names: list[str], suffix_message: str) -> str:
-        primary_key_logical_names = '、'.join(primary_key_col_names)
-        return f'{line_num}行目の{primary_key_logical_names}{suffix_message}'
-
     def __parse_str_to_date(self, check_date: str) -> tuple[bool, datetime]:
         try:
             check_date_time: datetime = datetime.strptime(check_date, '%Y%m%d')
@@ -192,18 +160,6 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
         pass
         ...

-    @abstractmethod
-    def check_emp_chg_type_cd_exists(self) -> list[str]:
-        """Staff-type code existence check"""
-        pass
-        ...
-
-    @abstractmethod
-    def check_ta_cd_exists(self) -> list[str]:
-        """Area code existence check"""
-        pass
-        ...
-
     @abstractmethod
     def check_emp_cd_exists(self) -> list[str]:
         """MUID existence check"""
@@ -249,8 +205,7 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
         mst_inst_repository: MstInstRepository,
         emp_master_repository: EmpMasterRepository,
         bu_master_repository: BuMasterRepository,
-        emp_chginst_repository: EmpChgInstRepository,
-        generic_kbn_mst_repository: GenericKbnMstRepository
+        emp_chginst_repository: EmpChgInstRepository
     ):
         super().__init__(
             csv_row,
@@ -259,13 +214,11 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
             mst_inst_repository,
             emp_master_repository,
             bu_master_repository,
-            emp_chginst_repository,
-            generic_kbn_mst_repository
+            emp_chginst_repository
         )
         self.inst_cd = super().get_csv_value(constants.CSV_NEW_INST_CD_COL_NO)
         self.inst_name = super().get_csv_value(constants.CSV_NEW_INST_NAME_COL_NO)
         self.ta_cd = super().get_csv_value(constants.CSV_NEW_TA_CD_COL_NO)
-        self.emp_chg_type_cd = super().get_csv_value(constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO)
         self.emp_cd = super().get_csv_value(constants.CSV_NEW_EMP_CD_COL_NO)
         self.emp_name_family = super().get_csv_value(constants.CSV_NEW_EMP_NAME_FAMILY_COL_NO)
         self.emp_name_first = super().get_csv_value(constants.CSV_NEW_EMP_NAME_FIRST_COL_NO)
@@ -284,9 +237,6 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO]))
-        if len(self.emp_chg_type_cd) == 0:
-            error_list.append(self.make_require_error_message(
-                self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.emp_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CD_COL_NO]))
@@ -321,26 +271,6 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
             は従業員マスタに存在しない もしくは 適用期間外のIDです')
         return error_list

-    def check_emp_chg_type_cd_exists(self) -> list[str]:
-        error_list = []
-        if not self.start_date or not self.emp_chg_type_cd:
-            return error_list
-        if is_not_empty(self.emp_chg_type_cd) and super().is_exist_emp_chg_type_cd(self.start_date) is False:
-            error_list.append(f'{self.line_num}行目の{constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO]}\
-                は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
-        return error_list
-
-    def check_ta_cd_exists(self) -> list[str]:
-        error_list = []
-        if not self.start_date or not self.ta_cd:
-            return error_list
-        if is_not_empty(self.ta_cd) and super().is_exist_ta_cd(self.start_date) is False:
-            error_list.append(f'{self.line_num}行目の{constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO]}\
-                は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
-        return error_list
-
     def check_bu_cd_exists(self) -> list[str]:
         error_list = []
@@ -373,15 +303,7 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
     def check_data_exists(self) -> list[str]:
         error_list = []
         if super().emp_chg_inst_count(self.start_date) > 0:
-            error_list.append(super().make_data_exist_error_message(
-                self.line_num,
-                primary_key_col_names=[
-                    constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_INST_CD_COL_NO],
-                    constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO],
-                    constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO],
-                    constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_START_DATE]
-                ]
-            ))
+            error_list.append(f'{self.line_num}行目の施設コード、領域コード、適用開始日がすべて同一のデータが既に登録されています。')
         return error_list
@@ -407,8 +329,7 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         mst_inst_repository: MstInstRepository,
         emp_master_repository: EmpMasterRepository,
         bu_master_repository: BuMasterRepository,
-        emp_chginst_repository: EmpChgInstRepository,
-        generic_kbn_mst_repository: GenericKbnMstRepository
+        emp_chginst_repository: EmpChgInstRepository
     ):
         super().__init__(
             csv_row,
@@ -417,8 +338,7 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
             mst_inst_repository,
             emp_master_repository,
             bu_master_repository,
-            emp_chginst_repository,
-            generic_kbn_mst_repository
+            emp_chginst_repository
         )
         self.bu_cd = super().get_csv_value(constants.CSV_CHANGE_BU_CD_COL_NO)
         self.bu_name = super().get_csv_value(constants.CSV_CHANGE_BU_NAME_COL_NO)
@@ -428,7 +348,6 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         self.inst_name = super().get_csv_value(constants.CSV_CHANGE_INST_NAME_COL_NO)
         self.ta_cd = super().get_csv_value(constants.CSV_CHANGE_TA_CD_COL_NO)
         self.explain = super().get_csv_value(constants.CSV_CHANGE_EXPLAIN_COL_NO)
-        self.emp_chg_type_cd = super().get_csv_value(constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO)
         self.emp_cd = super().get_csv_value(constants.CSV_CHANGE_EMP_CD_COL_NO)
         self.emp_full_name = super().get_csv_value(constants.CSV_CHANGE_EMP_FULL_NAME_COL_NO)
         self.inst_emp_start_date = super().get_csv_value(constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO)
@@ -451,9 +370,6 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
-        if len(self.emp_chg_type_cd) == 0:
-            error_list.append(self.make_require_error_message(
-                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.emp_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CD_COL_NO]))
@@ -472,9 +388,6 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
-        if len(self.emp_chg_type_cd) == 0:
-            error_list.append(self.make_require_error_message(
-                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.inst_emp_start_date) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num,
@@ -490,10 +403,6 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
-        if len(self.emp_chg_type_cd) == 0:
-            error_list.append(self.make_require_error_message(
-                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.emp_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CD_COL_NO]))
@@ -526,28 +435,6 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
             は従業員マスタに存在しない もしくは 適用期間外のIDです')
         return error_list

-    def check_emp_chg_type_cd_exists(self) -> list[str]:
-        error_list = []
-        if not self.inst_emp_start_date or not self.emp_chg_type_cd:
-            return error_list
-        if is_not_empty(self.emp_chg_type_cd) and super().is_exist_emp_chg_type_cd(self.inst_emp_start_date) is False:
-            error_list.append(f'{self.line_num}行目の{constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]}\
-                は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
-        return error_list
-
-    def check_ta_cd_exists(self) -> list[str]:
-        error_list = []
-        if not self.inst_emp_start_date or not self.ta_cd:
-            return error_list
-        if is_not_empty(self.ta_cd) and super().is_exist_ta_cd(self.inst_emp_start_date) is False:
-            error_list.append(f'{self.line_num}行目の{constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]}\
-                は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
-        return error_list
-
     def check_bu_cd_exists(self) -> list[str]:
         error_list = []
@@ -597,26 +484,10 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         error_list = []
         emp_chg_inst_count = super().emp_chg_inst_count(self.inst_emp_start_date)
         if self.comment == '追加' and emp_chg_inst_count > 0:
-            error_list.append(super().make_data_exist_error_message(
-                self.line_num,
-                primary_key_col_names=[
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO]
-                ]
-            ))
+            error_list.append(f'{self.line_num}行目の施設コード、領域コード、施設担当_開始日がすべて同一のデータが既に登録されています。')
         elif (self.comment == '終了' or self.comment == '担当者修正') and emp_chg_inst_count == 0:
-            error_list.append(super().make_data_not_exist_error_message(
-                self.line_num,
-                primary_key_col_names=[
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO],
-                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO]
-                ]
-            ))
+            error_list.append(f'{self.line_num}行目の施設コード、領域コード、施設担当_開始日がすべて同一のデータが存在しないため更新できません。')
         return error_list
@@ -654,8 +525,7 @@ class MasterMainteCSVItems:
         mst_inst_repository: MstInstRepository,
         emp_master_repository: EmpMasterRepository,
         bu_master_repository: BuMasterRepository,
emp_chginst_repository: EmpChgInstRepository, emp_chginst_repository: EmpChgInstRepository
generic_kbn_mst_repository: GenericKbnMstRepository
) -> None: ) -> None:
reader = csv.reader(file) reader = csv.reader(file)
csv_rows = [] csv_rows = []
@ -670,9 +540,7 @@ class MasterMainteCSVItems:
mst_inst_repository, mst_inst_repository,
emp_master_repository, emp_master_repository,
bu_master_repository, bu_master_repository,
emp_chginst_repository, emp_chginst_repository))
generic_kbn_mst_repository
))
self.lines = csv_rows self.lines = csv_rows
def __select_function(self, def __select_function(self,
@ -683,8 +551,7 @@ class MasterMainteCSVItems:
mst_inst_repository: MstInstRepository, mst_inst_repository: MstInstRepository,
emp_master_repository: EmpMasterRepository, emp_master_repository: EmpMasterRepository,
bu_master_repository: BuMasterRepository, bu_master_repository: BuMasterRepository,
emp_chginst_repository: EmpChgInstRepository, emp_chginst_repository: EmpChgInstRepository) -> MasterMainteCSVItem:
generic_kbn_mst_repository: GenericKbnMstRepository) -> MasterMainteCSVItem:
if function_type == 'new': if function_type == 'new':
return MasterMainteNewInstEmpCSVItem( return MasterMainteNewInstEmpCSVItem(
row, row,
@ -693,8 +560,7 @@ class MasterMainteCSVItems:
mst_inst_repository, mst_inst_repository,
emp_master_repository, emp_master_repository,
bu_master_repository, bu_master_repository,
emp_chginst_repository, emp_chginst_repository)
generic_kbn_mst_repository)
elif function_type == 'change': elif function_type == 'change':
return MasterMainteChangeInstEmpCSVItem( return MasterMainteChangeInstEmpCSVItem(
row, row,
@ -703,5 +569,4 @@ class MasterMainteCSVItems:
mst_inst_repository, mst_inst_repository,
emp_master_repository, emp_master_repository,
bu_master_repository, bu_master_repository,
emp_chginst_repository, emp_chginst_repository)
generic_kbn_mst_repository)


@@ -29,8 +29,8 @@ class MasterMainteEmpChgInstFunction(metaclass=ABCMeta):
     def save(self):
         error_list = []
         try:
-            self.emp_chginst_repository.begin()
             self.emp_chginst_repository.to_jst()
+            self.emp_chginst_repository.begin()
             (result_message, error_list) = self.write_emp_chg_inst_table()
             if len(error_list) > 0:
                 self.emp_chginst_repository.rollback()
@@ -46,7 +46,6 @@ class MasterMainteEmpChgInstFunction(metaclass=ABCMeta):
         self.emp_chginst_repository.insert_emp_chg_inst(
             data['施設コード'],
             data['領域コード'],
-            data['担当者種別コード'],
             data['MUID'],
             data['ビジネスユニットコード'],
             start_date,
@@ -149,7 +148,6 @@ class ChangeEmpChgInstFunction(MasterMainteEmpChgInstFunction):
         self.emp_chginst_repository.end_emp_chg_inst(
             data['施設コード'],
             data['領域コード'],
-            data['担当者種別コード'],
             data['施設担当_開始日'],
             data['終了日の変更'],
             self.user_name,
@@ -160,7 +158,6 @@ class ChangeEmpChgInstFunction(MasterMainteEmpChgInstFunction):
             data['施設コード'],
             data['領域コード'],
             data['施設担当_開始日'],
-            data['担当者種別コード'],
             data['MUID'],
             self.user_name,
             self.table_name)


@@ -2,22 +2,20 @@ from typing import Optional
 from fastapi import Form
-from src.model.request.request_base_model import RequestBaseModel
 from src.util.sanitize import sanitize
+from src.model.request.request_base_model import RequestBaseModel
 from src.util.string_util import is_not_empty

 @sanitize
 class MasterMainteCsvDlModel(RequestBaseModel):
-    # adaptは検索に使用する値
     ta_cd: Optional[str]
     adapt_ta_cd: Optional[str]
     inst_cd: Optional[str]
     adapt_inst_cd: Optional[str]
     emp_cd: Optional[str]
     adapt_emp_cd: Optional[str]
-    emp_chg_type_cd: Optional[str]
-    adapt_emp_chg_type_cd: Optional[str]
     apply_date_from: Optional[str]
     adapt_apply_date_from: Optional[str]
     start_date_from: Optional[str]
@@ -44,7 +42,6 @@ class MasterMainteCsvDlModel(RequestBaseModel):
                 ctrl_ta_cd: Optional[str] = Form(None),
                 ctrl_inst_cd: Optional[str] = Form(None),
                 ctrl_emp_cd: Optional[str] = Form(None),
-                ctrl_emp_chg_type_cd: Optional[str] = Form(None),
                 ctrl_apply_date_from: Optional[str] = Form(None),
                 ctrl_start_date_from: Optional[str] = Form(None),
                 ctrl_start_date_to: Optional[str] = Form(None),
@@ -61,7 +58,6 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             ctrl_ta_cd,
             ctrl_inst_cd,
             ctrl_emp_cd,
-            ctrl_emp_chg_type_cd,
             ctrl_apply_date_from,
             ctrl_start_date_from,
             ctrl_start_date_to,
@@ -79,7 +75,6 @@ class MasterMainteCsvDlModel(RequestBaseModel):
                      ctrl_ta_cd: str,
                      ctrl_inst_cd: str,
                      ctrl_emp_cd: str,
-                     ctrl_emp_chg_type_cd,
                      ctrl_apply_date_from: str,
                      ctrl_start_date_from: str,
                      ctrl_start_date_to: str,
@@ -94,7 +89,6 @@ class MasterMainteCsvDlModel(RequestBaseModel):
         ctrl_ta_cd = ctrl_ta_cd if is_not_empty(ctrl_ta_cd) else ''
         ctrl_inst_cd = ctrl_inst_cd if is_not_empty(ctrl_inst_cd) else ''
         ctrl_emp_cd = ctrl_emp_cd if is_not_empty(ctrl_emp_cd) else ''
-        ctrl_emp_chg_type_cd = ctrl_emp_chg_type_cd if is_not_empty(ctrl_emp_chg_type_cd) else ''

         adapt_apply_date_from = ''
         if is_not_empty(ctrl_apply_date_from):
@@ -153,8 +147,6 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             adapt_inst_cd=ctrl_inst_cd,
             emp_cd=ctrl_emp_cd,
             adapt_emp_cd=ctrl_emp_cd,
-            emp_chg_type_cd=ctrl_emp_chg_type_cd,
-            adapt_emp_chg_type_cd=ctrl_emp_chg_type_cd,
             apply_date_from=ctrl_apply_date_from,
             adapt_apply_date_from=adapt_apply_date_from,
             start_date_from=ctrl_start_date_from,


@@ -9,7 +9,6 @@ class InstEmpCsvDownloadViewModel(BaseModel):
     ta_cd: str = ''
     inst_cd: str = ''
     emp_cd: str = ''
-    emp_chg_type_cd: str = ''
     apply_date_from: str = ''
     start_date_from: str = ''
     start_date_to: str = ''


@ -1,10 +1,10 @@
from src.db import sql_condition as condition from src.repositories.base_repository import BaseRepository
from src.db.sql_condition import SQLCondition from src.db.sql_condition import SQLCondition
from src.logging.get_logger import get_logger from src.db import sql_condition as condition
from src.model.db.master_mente_count import MasterMenteCountModel from src.model.db.master_mente_count import MasterMenteCountModel
from src.model.request.master_mainte_csvdl import MasterMainteCsvDlModel from src.model.request.master_mainte_csvdl import MasterMainteCsvDlModel
from src.repositories.base_repository import BaseRepository
from src.util.string_util import is_not_empty from src.util.string_util import is_not_empty
from src.logging.get_logger import get_logger
logger = get_logger('従業員担当施設マスタ') logger = get_logger('従業員担当施設マスタ')
@ -28,7 +28,6 @@ class EmpChgInstRepository(BaseRepository):
( (
inst_cd, inst_cd,
ta_cd, ta_cd,
emp_chg_type_cd,
emp_cd, emp_cd,
bu_cd, bu_cd,
start_date, start_date,
@ -43,7 +42,6 @@ class EmpChgInstRepository(BaseRepository):
VALUES ( VALUES (
:inst_cd, :inst_cd,
:ta_cd, :ta_cd,
:emp_chg_type_cd,
:emp_cd, :emp_cd,
:bu_cd, :bu_cd,
:start_date, :start_date,
@ -57,14 +55,13 @@ class EmpChgInstRepository(BaseRepository):
) )
""" """
def insert_emp_chg_inst(self, inst_cd, ta_cd, emp_chg_type_cd, emp_cd, bu_cd, start_date, def insert_emp_chg_inst(self, inst_cd, ta_cd, emp_cd, bu_cd, start_date,
end_date, create_user_name, table_name): end_date, create_user_name, table_name):
try: try:
query = self.INSERT_SQL.format(table_name=table_name) query = self.INSERT_SQL.format(table_name=table_name)
self._database.execute(query, { self._database.execute(query, {
'inst_cd': inst_cd, 'inst_cd': inst_cd,
'ta_cd': ta_cd, 'ta_cd': ta_cd,
'emp_chg_type_cd': emp_chg_type_cd,
'emp_cd': emp_cd, 'emp_cd': emp_cd,
'bu_cd': bu_cd, 'bu_cd': bu_cd,
'start_date': start_date, 'start_date': start_date,
@ -85,19 +82,17 @@ class EmpChgInstRepository(BaseRepository):
update_date = NOW() update_date = NOW()
WHERE WHERE
inst_cd = :inst_cd inst_cd = :inst_cd
AND ta_cd = :ta_cd and ta_cd = :ta_cd
AND emp_chg_type_cd = :emp_chg_type_cd and start_date = :start_date
AND start_date = :start_date
""" """
def end_emp_chg_inst(self, inst_cd, ta_cd, emp_chg_type_cd, start_date, def end_emp_chg_inst(self, inst_cd, ta_cd, start_date,
end_date, update_user_name, table_name): end_date, update_user_name, table_name):
try: try:
query = self.UPDATE_END_DATE_SQL.format(table_name=table_name) query = self.UPDATE_END_DATE_SQL.format(table_name=table_name)
self._database.execute(query, { self._database.execute(query, {
'inst_cd': inst_cd, 'inst_cd': inst_cd,
'ta_cd': ta_cd, 'ta_cd': ta_cd,
'emp_chg_type_cd': emp_chg_type_cd,
'start_date': start_date, 'start_date': start_date,
'end_date': end_date, 'end_date': end_date,
'update_user_name': update_user_name 'update_user_name': update_user_name
@ -115,18 +110,16 @@ class EmpChgInstRepository(BaseRepository):
update_date = NOW() update_date = NOW()
where where
inst_cd = :inst_cd inst_cd = :inst_cd
AND ta_cd = :ta_cd and ta_cd = :ta_cd
AND emp_chg_type_cd = :emp_chg_type_cd and start_date = :start_date
AND start_date = :start_date
""" """
def modify_emp_chg_inst(self, inst_cd, ta_cd, start_date, emp_chg_type_cd, emp_cd, update_user_name, table_name): def modify_emp_chg_inst(self, inst_cd, ta_cd, start_date, emp_cd, update_user_name, table_name):
try: try:
query = self.UPDATE_EMP_CD_SQL.format(table_name=table_name) query = self.UPDATE_EMP_CD_SQL.format(table_name=table_name)
self._database.execute(query, { self._database.execute(query, {
'inst_cd': inst_cd, 'inst_cd': inst_cd,
'ta_cd': ta_cd, 'ta_cd': ta_cd,
'emp_chg_type_cd': emp_chg_type_cd,
'start_date': start_date, 'start_date': start_date,
'emp_cd': emp_cd, 'emp_cd': emp_cd,
'update_user_name': update_user_name 'update_user_name': update_user_name
@ -143,15 +136,14 @@ class EmpChgInstRepository(BaseRepository):
WHERE WHERE
inst_cd = :inst_cd inst_cd = :inst_cd
AND ta_cd = :ta_cd AND ta_cd = :ta_cd
AND emp_chg_type_cd = :emp_chg_type_cd
AND start_date = :start_date AND start_date = :start_date
""" """
def fetch_count(self, inst_cd, ta_cd, emp_chg_type_cd, start_date, table_name) -> MasterMenteCountModel: def fetch_count(self, inst_cd, ta_cd, start_date, table_name) -> MasterMenteCountModel:
try: try:
query = self.FETCH_COUNT_SQL.format(table_name=table_name) query = self.FETCH_COUNT_SQL.format(table_name=table_name)
result = self._database.execute_select(query, {'inst_cd': inst_cd, 'ta_cd': ta_cd, result = self._database.execute_select(query, {'inst_cd': inst_cd, 'ta_cd': ta_cd,
'emp_chg_type_cd': emp_chg_type_cd, 'start_date': start_date}) 'start_date': start_date})
models = [MasterMenteCountModel(**r) for r in result] models = [MasterMenteCountModel(**r) for r in result]
if len(models) == 0: if len(models) == 0:
return 0 return 0
@ -165,7 +157,6 @@ class EmpChgInstRepository(BaseRepository):
eci.inst_cd AS inst_cd, eci.inst_cd AS inst_cd,
mi.inst_name AS inst_name, mi.inst_name AS inst_name,
eci.ta_cd AS ta_cd, eci.ta_cd AS ta_cd,
eci.emp_chg_type_cd AS emp_chg_type_cd,
eci.emp_cd AS emp_cd, eci.emp_cd AS emp_cd,
CONCAT(emp.emp_name_family, " ", emp.emp_name_first) AS emp_name_full, CONCAT(emp.emp_name_family, " ", emp.emp_name_first) AS emp_name_full,
eci.bu_cd AS bu_cd, eci.bu_cd AS bu_cd,
@ -221,11 +212,6 @@ class EmpChgInstRepository(BaseRepository):
parameter.adapt_emp_cd = f'%{parameter.emp_cd}%' parameter.adapt_emp_cd = f'%{parameter.emp_cd}%'
where_clauses.append(SQLCondition('eci.emp_cd', condition.LIKE, 'adapt_emp_cd')) where_clauses.append(SQLCondition('eci.emp_cd', condition.LIKE, 'adapt_emp_cd'))
# 担当者種別コードが入力されていた場合
if is_not_empty(parameter.emp_chg_type_cd):
parameter.adapt_emp_chg_type_cd = f'%{parameter.emp_chg_type_cd}%'
where_clauses.append(SQLCondition('eci.emp_chg_type_cd', condition.LIKE, 'adapt_emp_chg_type_cd'))
# 適用期間内が入力されていた場合 # 適用期間内が入力されていた場合
if is_not_empty(parameter.adapt_apply_date_from): if is_not_empty(parameter.adapt_apply_date_from):
where_clauses.append(SQLCondition('eci.start_date', condition.LE, 'adapt_apply_date_from')) where_clauses.append(SQLCondition('eci.start_date', condition.LE, 'adapt_apply_date_from'))


@@ -1,33 +0,0 @@
-from src.repositories.base_repository import BaseRepository
-from src.model.db.master_mente_count import MasterMenteCountModel
-from src.logging.get_logger import get_logger
-
-logger = get_logger('汎用区分マスタ')
-
-
-class GenericKbnMstRepository(BaseRepository):
-    FETCH_SQL = """\
-        SELECT
-            COUNT(*) AS count
-        FROM
-            src05.generic_kbn_mst
-        WHERE
-            generic_kbn_mst.generic_kbn_cd = :generic_kbn_cd
-        AND
-            generic_kbn_mst.kbn_cd = :kbn_cd
-        AND
-            STR_TO_DATE( :start_date , '%Y%m%d') BETWEEN generic_kbn_mst.start_date AND generic_kbn_mst.end_date\
-    """
-
-    def fetch_count(self, generic_kbn_cd: str, kbn_cd: str, start_date: str) -> MasterMenteCountModel:
-        try:
-            query = self.FETCH_SQL
-            result = self._database.execute_select(query, {'generic_kbn_cd': generic_kbn_cd, 'kbn_cd': kbn_cd, 'start_date': start_date})
-            models = [MasterMenteCountModel(**r) for r in result]
-            if len(models) == 0:
-                return 0
-            return models[0].count
-        except Exception as e:
-            logger.error(f"DB Error : Exception={e.args}")
-            raise e


@@ -6,19 +6,6 @@ logger = get_logger('ユーザー取得')

 class UserMasterRepository(BaseRepository):
-    def to_jst(self):
-        self._database.to_jst()
-
-    def begin(self):
-        self._database.begin()
-
-    def commit(self):
-        self._database.commit()
-
-    def rollback(self):
-        self._database.rollback()
-
     FETCH_SQL = """\
         SELECT
             *
@@ -39,46 +26,3 @@ class UserMasterRepository(BaseRepository):
         except Exception as e:
             logger.exception(f"DB Error : Exception={e}")
             raise e
-
-    def increase_login_failed_count(self, parameter: dict) -> UserMasterModel:
-        try:
-            query = """\
-                UPDATE
-                    src05.user_mst
-                SET
-                    mntuser_login_failed_cnt =
-                    CASE
-                        WHEN
-                            DATE(mntuser_last_login_failed_datetime) = DATE(CURRENT_TIMESTAMP())
-                        THEN
-                            mntuser_login_failed_cnt + 1
-                        ELSE
-                            1
-                    END,
-                    mntuser_last_login_failed_datetime = CURRENT_TIMESTAMP()
-                WHERE
-                    user_id = :user_id
-                AND
-                    mntuser_flg = 1;\
-            """
-            self._database.execute(query, parameter)
-        except Exception as e:
-            logger.exception(f"DB Error : Exception={e}")
-            raise e
-
-    def disable_mnt_user(self, parameter: dict) -> UserMasterModel:
-        try:
-            query = """\
-                UPDATE
-                    src05.user_mst
-                SET
-                    enabled_flg = 'N'
-                WHERE
-                    user_id = :user_id
-                AND
-                    mntuser_flg = 1\
-            """
-            self._database.execute(query, parameter)
-        except Exception as e:
-            logger.exception(f"DB Error : Exception={e}")
-            raise e


@@ -103,7 +103,6 @@ class AfterSetCookieSessionRoute(MeDaCaRoute):
     """事後処理として、セッションキーをcookieに設定するカスタムルートハンドラー"""
     async def post_process_route(self, request: Request, response: Response):
         response = await super().post_process_route(request, response)
-
         session_key = response.headers.get('session_key', None)
         # セッションキーがない場合はセットせずに返す
         if session_key is None:
@@ -124,6 +123,7 @@ class AfterSetCookieSessionRoute(MeDaCaRoute):
         response.set_cookie(
             key='session',
             value=session_key,
+            max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # cookieの有効期限は秒数指定なので、60秒をかける
             secure=True,
             httponly=True
         )


@@ -49,27 +49,6 @@ class LoginService(BaseService):
         user_record: UserMasterModel = self.user_repository.fetch_one({'user_id': user_id})
         return user_record

-    def increase_login_failed_count(self, user_id: str):
-        try:
-            # セッション内のタイムゾーン変更のため、明示的にトランザクションを開始する
-            self.user_repository.begin()
-            self.user_repository.to_jst()
-            self.user_repository.increase_login_failed_count({'user_id': user_id})
-            self.user_repository.commit()
-        except Exception as e:
-            self.user_repository.rollback()
-            raise e
-
-    def on_login_fail_limit_exceeded(self, user_id: str):
-        self.user_repository.disable_mnt_user({'user_id': user_id})
-
-    def is_login_failed_limit_exceeded(self, user_id: str):
-        user_record: UserMasterModel = self.user_repository.fetch_one({'user_id': user_id})
-        if user_record is None:
-            return False
-        return user_record.is_login_failed_limit_exceeded()
-
     def __secret_hash(self, username: str):
         # see - https://aws.amazon.com/jp/premiumsupport/knowledge-center/cognito-unable-to-verify-secret-hash/  # noqa
         message = bytes(username + environment.COGNITO_CLIENT_ID, 'utf-8')


@@ -19,7 +19,6 @@ from src.repositories.mst_inst_repository import MstInstRepository
 from src.repositories.bu_master_cd_repository import BuMasterRepository
 from src.repositories.emp_master_repository import EmpMasterRepository
 from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
-from src.repositories.generic_kbn_mst_repository import GenericKbnMstRepository
 from src.model.internal.master_mainte_csv import MasterMainteCSVItems
 from src.model.internal.master_mainte_emp_chg_inst_function import NewEmpChgInstFunction
 from src.model.internal.master_mainte_emp_chg_inst_function import ChangeEmpChgInstFunction
@@ -39,7 +38,6 @@ class MasterMainteService(BaseService):
         'emp_master_repository': EmpMasterRepository,
         'bu_master_repository': BuMasterRepository,
         'emp_chginst_repository': EmpChgInstRepository,
-        'generic_kbn_mst_repository': GenericKbnMstRepository,
     }

     CLIENTS = {
@@ -50,7 +48,6 @@ class MasterMainteService(BaseService):
     emp_master_repository: EmpMasterRepository
     bu_master_repository: BuMasterRepository
     emp_chginst_repository: EmpChgInstRepository
-    generic_kbn_mst_repository: GenericKbnMstRepository
     s3_client: S3Client

     def __init__(self, repositories: dict[str, BaseRepository], clients: dict[str, AWSAPIClient]) -> None:
@@ -59,7 +56,6 @@ class MasterMainteService(BaseService):
         self.emp_master_repository = repositories['emp_master_repository']
         self.bu_master_repository = repositories['bu_master_repository']
         self.emp_chginst_repository = repositories['emp_chginst_repository']
-        self.generic_kbn_mst_repository = repositories['generic_kbn_mst_repository']
         self.s3_client = clients['s3_client']

     def prepare_mainte_csv_up_view(self,
@@ -81,8 +77,6 @@ class MasterMainteService(BaseService):
             self.mst_inst_repository,
             self.emp_master_repository,
             self.bu_master_repository,
-            self.emp_chginst_repository,
-            self.generic_kbn_mst_repository
+            self.emp_chginst_repository
         )
         error_message_list = []
@@ -153,8 +148,8 @@ class MasterMainteService(BaseService):
     def copy_data_real_to_dummy(self) -> TableOverrideViewModel:
         try:
-            self.emp_chginst_repository.begin()
             self.emp_chginst_repository.to_jst()
+            self.emp_chginst_repository.begin()
             self.emp_chginst_repository.delete_dummy_table()
             self.emp_chginst_repository.copy_real_to_dummy()
             self.emp_chginst_repository.commit()


@@ -17,10 +17,3 @@ def get_session(key: str) -> UserSession:
     except UserSession.DoesNotExist as e:
         logger.debug(f'セッション取得失敗:{e}')
         return None
-
-
-def delete_session (session: UserSession):
-    try:
-        session.delete()
-        return
-    except:
-        return


@@ -63,7 +63,6 @@ LOGOUT_REASON_BACKUP_PROCESSING = 'dump_processing'
 LOGOUT_REASON_NOT_LOGIN = 'not_login'
 LOGOUT_REASON_DB_ERROR = 'db_error'
 LOGOUT_REASON_UNEXPECTED = 'unexpected'
-LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED = 'login_failed_limit_exceeded'

 LOGOUT_REASON_MESSAGE_MAP = {
     LOGOUT_REASON_DO_LOGOUT: 'Logoutしました。',
@@ -73,8 +72,7 @@ LOGOUT_REASON_MESSAGE_MAP = {
     LOGOUT_REASON_BACKUP_PROCESSING: 'バックアップ取得を開始しました。<br>日次バッチ更新が終了するまでマスターメンテは使用できません',
     LOGOUT_REASON_NOT_LOGIN: 'Loginしてからページにアクセスしてください。',
     LOGOUT_REASON_DB_ERROR: 'DB接続に失敗しました。<br>再度Loginするか、<br>管理者にお問い合わせください。',
-    LOGOUT_REASON_UNEXPECTED: '予期しないエラーが発生しました。<br>再度Loginするか、<br>管理者に問い合わせてください。',
-    LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED: 'ログイン失敗回数の上限を超えましたので<br>アカウントをロックしました。<br>管理者に連絡してください'
+    LOGOUT_REASON_UNEXPECTED: '予期しないエラーが発生しました。<br>再度Loginするか、<br>管理者に問い合わせてください。'
 }

 # 新規施設担当者登録CSV(マスターメンテ)
@@ -82,7 +80,6 @@ NEW_INST_EMP_CSV_LOGICAL_NAMES = [
     '施設コード',
     '施設名',
     '領域コード',
-    '担当者種別コード',
     'MUID',
     '担当者名(姓)',
     '担当者名(名)',
@@ -96,20 +93,18 @@ CSV_NEW_INST_CD_COL_NO = 0
 CSV_NEW_INST_NAME_COL_NO = 1
 # 領域コードの列No
 CSV_NEW_TA_CD_COL_NO = 2
-# 担当者種別コードの列No
-CSV_NEW_EMP_CHG_TYPE_CD_COL_NO = 3
 # MUIDの列No
-CSV_NEW_EMP_CD_COL_NO = 4
+CSV_NEW_EMP_CD_COL_NO = 3
 # 担当者名の列No
-CSV_NEW_EMP_NAME_FAMILY_COL_NO = 5
+CSV_NEW_EMP_NAME_FAMILY_COL_NO = 4
 # 担当者名の列No
-CSV_NEW_EMP_NAME_FIRST_COL_NO = 6
+CSV_NEW_EMP_NAME_FIRST_COL_NO = 5
 # ビジネスユニットコードの列No
-CSV_NEW_BU_CD_COL_NO = 7
+CSV_NEW_BU_CD_COL_NO = 6
 # 適用開始日の列No
-CSV_NEW_START_DATE = 8
+CSV_NEW_START_DATE = 7
 # 適用終了日の列No
-CSV_NEW_END_DATE = 9
+CSV_NEW_END_DATE = 8

 # 施設担当者変更登録CSV(マスターメンテ)
 CHANGE_INST_CSV_LOGICAL_NAMES = [
@@ -121,7 +116,6 @@ CHANGE_INST_CSV_LOGICAL_NAMES = [
     '施設名',
     '領域コード',
     '説明',
-    '担当者種別コード',
     'MUID',
     '担当者名',
     '施設担当_開始日',
@@ -145,20 +139,18 @@ CSV_CHANGE_INST_NAME_COL_NO = 5
 CSV_CHANGE_TA_CD_COL_NO = 6
 # 説明の列No
 CSV_CHANGE_EXPLAIN_COL_NO = 7
-# 担当者種別コード
-CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO = 8
 # MUIDの列No
-CSV_CHANGE_EMP_CD_COL_NO = 9
+CSV_CHANGE_EMP_CD_COL_NO = 8
 # 担当者名の列No
-CSV_CHANGE_EMP_FULL_NAME_COL_NO = 10
+CSV_CHANGE_EMP_FULL_NAME_COL_NO = 9
 # 施設担当_開始日の列No
-CSV_CHANGE_INST_EMP_START_DATE_COL_NO = 11
+CSV_CHANGE_INST_EMP_START_DATE_COL_NO = 10
 # 施設担当_終了日の列No
-CSV_CHANGE_INST_EMP_END_DATE_COL_NO = 12
+CSV_CHANGE_INST_EMP_END_DATE_COL_NO = 11
 # 終了日の変更の列No
-CSV_CHANGE_CHANGE_END_DATE_COL_NO = 13
+CSV_CHANGE_CHANGE_END_DATE_COL_NO = 12
 # コメントの列No
-CSV_CHANGE_COMMENT = 14
+CSV_CHANGE_COMMENT = 13

 # CSVアップロードテーブル名(マスターメンテ)
 CSV_REAL_TABLE_NAME = '本番テーブル'
@@ -170,7 +162,6 @@ MENTE_CSV_DOWNLOAD_EXTRACT_COLUMNS = [
     'inst_cd',
     'inst_name',
     'ta_cd',
-    'emp_chg_type_cd',
     'emp_cd',
     'emp_name_full',
     'bu_cd',
@@ -187,7 +178,6 @@ MENTE_CSV_DOWNLOAD_HEADER = [
     '施設コード',
     '施設名',
     '領域コード',
-    '担当者種別コード',
     'MUID',
     '担当者名',
     'ビジネスユニットコード',
@@ -217,6 +207,3 @@ DISPLAY_USER_STOP_DIV_SHORT = {
     '03': '特定項目停止',
     '04': '全DM停止'
 }
-
-# ログイン失敗回数上限(保守ユーザー)
-LOGIN_FAIL_LIMIT = 10


@ -1,19 +1,19 @@
<meta charset="utf-8"> <meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1"> <meta name="viewport" content="width=device-width, initial-scale=1">
<meta http-equiv="content-type" content="text/html; charset=utf-8" /> <meta name="format-detection" content="telephone=no, address=no" http-equiv="content-type" content="text/html; charset=utf-8" />
<meta name="format-detection" content="telephone=no, address=no" />
<title>{{subtitle}}</title> <title>{{subtitle}}</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/css/bootstrap.min.css" crossorigin="anonymous"> <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/css/bootstrap.min.css" integrity="sha384-GLhlTQ8iRABdZLl6O3oVMWSktQOp6b7In1Zl3/Jr59b6EGGoI1aFkw7cmDA6j6gD" crossorigin="anonymous">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/flatpickr/dist/flatpickr.min.css" crossorigin="anonymous"> <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.10.2/font/bootstrap-icons.css" integrity="sha384-b6lVK+yci+bfDmaY1u0zE8YYJt0TZxLEAFyYSLHId4xoVvsrQu3INevFKo+Xir8e" crossorigin="anonymous">
<link rel="stylesheet" href="/static/css/main_theme.css" integrity="sha384-k0YpJBvcGJXdJlt8yqnhPYuU7tHQfdv4C80KDdGf72dzzAVWUp+ek+A1cqOV5o4t"> <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/flatpickr/dist/flatpickr.min.css">
<link rel="stylesheet" href="/static/css/pagenation.css" integrity="sha384-CDhOHftwvzWdI3cmvl0PESIdU5i0qjWbz8+HE9poJscglyrB0jzXZpVkb51xigty"> <link rel="stylesheet" href="/static/css/main_theme.css">
<link rel="stylesheet" href="/static/css/datepicker.css" integrity="sha384-I3gPqeqj0wDLoF6oS/OuMJ5C+BI210zLrJvQvNRVdvyyI9+qrraaQK2L9vvhTA8x"> <link rel="stylesheet" href="/static/css/pagenation.css">
<link rel="stylesheet" href="/static/css/loading.css" integrity="sha384-f9FRohCbLarb6Z91FWRbfNIIYYLx/5Kxqw19CB9Z0GxXunS9j0gRWWl50LayDAG7"> <link rel="stylesheet" href="/static/css/datepicker.css">
<script src="https://cdn.jsdelivr.net/npm/jquery@3.6.3/dist/jquery.min.js" crossorigin="anonymous"></script> <link rel="stylesheet" href="/static/css/loading.css">
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/js/bootstrap.bundle.min.js" crossorigin="anonymous"></script> <script src="https://code.jquery.com/jquery-3.6.3.min.js" integrity="sha256-pvPw+upLPUjgMXY0G+8O0xUf+/Im1MZjXxxgOcBQBXU=" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/paginationjs@2.5.0/dist/pagination.min.js" crossorigin="anonymous"></script> <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/js/bootstrap.bundle.min.js" integrity="sha384-w76AqPfDkMBDXo30jS1Sgez6pr3x5MlQ1ZAGC+nuZB+EYdgRZgiwxhTBTkF7CXvN" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/flatpickr@4.6.13/dist/flatpickr.min.js" crossorigin="anonymous"></script> <script src="https://pagination.js.org/dist/2.5.0/pagination.min.js" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/flatpickr/dist/l10n/ja.min.js" crossorigin="anonymous"></script> <script src="https://cdn.jsdelivr.net/npm/flatpickr@4.6.13/dist/flatpickr.min.js"></script>
<script src="/static/function/businessLogicScript.js" integrity="sha384-ytd1o7Rx4BPzjO3RpzR9fW/Z4avGzS7+BRPZVUsQp5X4zXB6xdZpR47/En1mNl7s" crossorigin="anonymous"></script> <script src="https://cdn.jsdelivr.net/npm/flatpickr/dist/l10n/ja.min.js"></script>
<script src="/static/lib/fixed_midashi.js" integrity="sha384-mCd6L3DNaLgUWyH051BywJfzlVavCkK6F0wbMqG+j7jAq174Uf7HJdq3H4wxCJKs" crossorigin="anonymous"></script> <script src="/static/function/businessLogicScript.js"></script>
<script src="/static/lib/fixed_midashi.js"></script>


@@ -81,17 +81,6 @@
</td>
</tr>
<!-- Search form, row 2 -->
-<tr>
-<!-- Staff type code -->
-<td class="searchLabelTd">Staff type code:</td>
-<td class="searchInputTd">
-<input class="searchTextbox" type="text" name="ctrl_emp_chg_type_cd" value="{{mainte_csv_dl.emp_chg_type_cd | safe}}" maxlength='10'
-onchange="formBtDisabled()"
-oninput="formBtDisabled()"
->
-</td>
-</tr>
-<!-- Search form, row 3 -->
<tr>
<!-- Within the effective period -->
<td class="searchLabelTd">Within the effective period:</td>
@@ -128,7 +117,7 @@
>
</td>
</tr>
-<!-- Search form, row 4 -->
+<!-- Search form, row 3 -->
<tr>
<!-- Target table -->
<td class="searchLabelTd">Target table:</td>
@@ -171,7 +160,7 @@
>
</td>
</tr>
-<!-- Search form, row 5 -->
+<!-- Search form, row 4 -->
<tr>
<!-- Search / Clear buttons -->
<td class="searchButtonTd" colspan="6">


@@ -15,8 +15,61 @@
{{logout.reason}}
{% endautoescape %}
</p>
+<!-- <?php
+// When no GET parameter arrived and the reason is unknown
+if(!(isset($_GET['reason']))){
+$userflg = null;
+// When the logout button was pressed
+} else if($_GET['reason'] == 'logoutBtn'){
+?>
+<p class="logout_p"><?php echo $logoutMsg ?></p>
+<?php
+// Shown when login fails
+} else if($_GET['reason'] == 'loginErr'){
+?>
+<p class="logout_p"><?php echo $loginErrMsg ?></p>
+<?php
+// When an error occurs because the daily batch process is running
+} else if($_GET['reason'] == 'batchProcess'){
+?>
+<p class="logout_p"><?php echo $batchProcessMsg ?></p>
+<?php
+// When an error occurs because the master-maintenance daily batch is running
+} else if($_GET['reason'] == 'batchProcessNewInstEmpRegist'){
+?>
+<p class="logout_p"><?php echo $batchProcessNewInstEmpRegistMsg ?></p>
+<?php
+// When it is unclear which user type logged in
+} else if (!(isset($userflg))) {
+} else {
+$userflg = null;
+?>
+<p class="logout_p"><?php echo $unexpectedErrMsg ?></p>
+<?php
+}
+?> -->
<br><br><br>
<p class="logout_p"><a href="{{ logout.redirect_to }}">{{logout.link_text}}</a></p>
+<!-- To the MeDaCA function menu -->
+<!-- <p class="logout_p"><a href="redirect_to">Return to the Login screen</a></p> -->
+<!-- <?php
+if (!(isset($userflg))) {
+?>
+<p class="logout_p"><a href="<?php echo $groupwarePath ?>"><?php echo $groupwareBackMsg ?></a></p>
+<?php
+} else if($userflg == 1){
+?>
+<p class="logout_p"><a href="<?php echo $maintLoginPath ?>"><?php echo $loginBackMsg ?></a></p>
+<?php
+} else {
+?>
+<p class="logout_p"><a href="<?php echo $groupwarePath ?>"><?php echo $groupwareBackMsg ?></a></p>
+<?php
+}
+?> -->
</div>
</body>
</html>


@@ -1,13 +0,0 @@
tests/*
.coverage
.env
.env.example
.report/*
.vscode/*
.pytest_cache/*
*/__pycache__/*
Dockerfile
pytest.ini
README.md
*.sql
*.gz


@@ -1,7 +0,0 @@
DB_HOST=************
DB_PORT=3306
DB_USERNAME=************
DB_PASSWORD=************
DB_SCHEMA=src05
DUMP_FILE_S3_PATH=*******************
LOG_LEVEL=INFO
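These keys are consumed by `src/system_var/environment.py` (not shown in this diff). A hypothetical sketch of how such a module can expose them as constants — the variable names mirror the listing above, while the fallback defaults are assumptions of this sketch only:

```python
import os

# Hypothetical mirror of src/system_var/environment.py: read the .env keys
# shown above from the process environment. Defaults are illustrative only.
DB_HOST = os.environ.get("DB_HOST", "localhost")
DB_PORT = int(os.environ.get("DB_PORT", "3306"))
DB_USERNAME = os.environ.get("DB_USERNAME", "")
DB_PASSWORD = os.environ.get("DB_PASSWORD", "")
DB_SCHEMA = os.environ.get("DB_SCHEMA", "src05")
DUMP_FILE_S3_PATH = os.environ.get("DUMP_FILE_S3_PATH", "")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
```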


@@ -1,16 +0,0 @@
.vscode/settings.json
.env

# python
__pycache__

# python test
.pytest_cache
.coverage
.report/

# mysql config file
my.cnf

# compress file
*.gz


@@ -1,17 +0,0 @@
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "(DEBUG)restore dbdump",
            "type": "python",
            "request": "launch",
            "program": "entrypoint.py",
            "console": "integratedTerminal",
            "justMyCode": true
        }
    ]
}


@@ -1,31 +0,0 @@
{
    "[python]": {
        "editor.defaultFormatter": null,
        "editor.formatOnSave": true,
        "editor.codeActionsOnSave": {
            "source.organizeImports": true
        }
    },
    // Python interpreter path
    "python.defaultInterpreterPath": "<path to the Python interpreter>",
    "python.linting.lintOnSave": true,
    "python.linting.enabled": true,
    "python.linting.pylintEnabled": false,
    "python.linting.flake8Enabled": true,
    "python.linting.flake8Args": [
        "--max-line-length=200",
        "--ignore=F541"
    ],
    "python.formatting.provider": "autopep8",
    "python.formatting.autopep8Path": "autopep8",
    "python.formatting.autopep8Args": [
        "--max-line-length", "200",
        "--ignore=F541"
    ],
    "python.testing.pytestArgs": [
        "tests/batch/ultmarc"
    ],
    "python.testing.unittestEnabled": false,
    "python.testing.pytestEnabled": true
}


@@ -1,45 +0,0 @@
FROM python:3.12-slim-bookworm

ENV TZ="Asia/Tokyo"
# Do not buffer Python's standard output
ENV PYTHONUNBUFFERED=1
# Do not write Python bytecode files
ENV PYTHONDONTWRITEBYTECODE=1

WORKDIR /usr/src/app

COPY Pipfile Pipfile.lock ./
# Copy the text file whose contents are fed to dpkg's standard input when installing mysql-apt-config
COPY mysql_dpkg_selection.txt ./

# Install required packages
RUN apt update && apt install -y less vim curl wget gzip unzip sudo lsb-release

# Install MySQL
RUN \
    wget https://dev.mysql.com/get/mysql-apt-config_0.8.29-1_all.deb && \
    apt install -y gnupg && \
    dpkg -i mysql-apt-config_0.8.29-1_all.deb < mysql_dpkg_selection.txt && \
    apt update && \
    apt install -y mysql-client

# Install AWS CLI v2
RUN \
    curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
    unzip awscliv2.zip && \
    sudo ./aws/install

# Install the Python libraries
RUN \
    pip install --upgrade pip wheel setuptools && \
    pip install pipenv --no-cache-dir && \
    pipenv install --system --deploy && \
    pip uninstall -y pipenv virtualenv-clone virtualenv

# Apply security updates only for installed packages
RUN \
    apt install -y unattended-upgrades && \
    unattended-upgrades

COPY src ./src
COPY entrypoint.py entrypoint.py

CMD ["python", "entrypoint.py"]


@@ -1,16 +0,0 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]

[dev-packages]
autopep8 = "*"
flake8 = "*"

[requires]
python_version = "3.12"

[pipenv]
allow_prereleases = true


@@ -1,63 +0,0 @@
{
    "_meta": {
        "hash": {
            "sha256": "2f7808325e11704ced6ad10c85e1d583663a03d7ccabaa9696ab1fe133a6b30c"
        },
        "pipfile-spec": 6,
        "requires": {
            "python_version": "3.12"
        },
        "sources": [
            {
                "name": "pypi",
                "url": "https://pypi.org/simple",
                "verify_ssl": true
            }
        ]
    },
    "default": {},
    "develop": {
        "autopep8": {
            "hashes": [
                "sha256:89440a4f969197b69a995e4ce0661b031f455a9f776d2c5ba3dbd83466931758",
                "sha256:ce8ad498672c845a0c3de2629c15b635ec2b05ef8177a6e7c91c74f3e9b51128"
            ],
            "index": "pypi",
            "markers": "python_version >= '3.9'",
            "version": "==2.3.2"
        },
        "flake8": {
            "hashes": [
                "sha256:1cbc62e65536f65e6d754dfe6f1bada7f5cf392d6f5db3c2b85892466c3e7c1a",
                "sha256:c586ffd0b41540951ae41af572e6790dbd49fc12b3aa2541685d253d9bd504bd"
            ],
            "index": "pypi",
            "markers": "python_full_version >= '3.8.1'",
            "version": "==7.1.2"
        },
        "mccabe": {
            "hashes": [
                "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325",
                "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"
            ],
            "markers": "python_version >= '3.6'",
            "version": "==0.7.0"
        },
        "pycodestyle": {
            "hashes": [
                "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
                "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
            ],
            "markers": "python_version >= '3.8'",
            "version": "==2.12.1"
        },
        "pyflakes": {
            "hashes": [
                "sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f",
                "sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a"
            ],
            "markers": "python_version >= '3.8'",
            "version": "==3.2.0"
        }
    }
}


@@ -1,67 +0,0 @@
# Dump restore script

## Overview

This process is not tied to any specific feature; it is intended to be run on demand as a common utility.

## Environment

- Python 3.9
- MySQL 8.23
- VSCode

## Setup

- Setting up Python
  - See [Python environment setup](https://nds-tyo.backlog.com/alias/wiki/1874930) in the Merck_NewDWH Development 2021 wiki
  - Complete the steps up to and including "Installing Pipenv"
  - Once that is done, run the following command under the project root to create a Python virtual environment
    - `pipenv install --dev --python <python version installed with pyenv>`
    - Note down the virtual-environment path printed by this step; it is used in the VSCode setup below
- Setting up MySQL
  - On Windows, download it from the following link
    - <https://dev.mysql.com/downloads/installer/>
  - When using Docker, the MySQL setup in the "newsdwh-tools" repository is convenient
    - Run the following command inside the "crm-table-to-ddl" folder
    - `docker-compose up -d`
    - For setting up Docker itself, see the [Docker setup instructions](https://nds-tyo.backlog.com/alias/wiki/1754332)
  - Load the data
    - Create the "src05" schema in the database you started
    - Download the [local development data](https://ndstokyo.sharepoint.com/:f:/r/sites/merck-new-dwh-team/Shared%20Documents/03.NewDWH%E6%A7%8B%E7%AF%89%E3%83%95%E3%82%A7%E3%83%BC%E3%82%BA3/02.%E9%96%8B%E7%99%BA/90.%E9%96%8B%E7%99%BA%E5%85%B1%E6%9C%89/%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E9%96%8B%E7%99%BA%E7%94%A8%E3%83%87%E3%83%BC%E3%82%BF?csf=1&web=1&e=VVcRUs) and restore it with the mysql command
    - `mysql -h <host> -P <port> -u <user> -p src05 < src05_dump.sql`
- Setting the environment variables
  - Copy the ".env.example" file to create a ".env" file
  - Set the environment variables; ask the project members for the values to use
- Setting up VSCode
  - Copy the ".vscode/recommended_settings.json" file to create "settings.json"
  - Change "python.defaultInterpreterPath" to the virtual-environment path created during the Python setup

## Execution

- Pressing "F5" in VSCode starts the batch process.
- "entrypoint.py" is the entry point of the batch process.
- The actual processing is done in "src/restore_backup.py".

## Folder structure

```txt
.
├── .env.example -- sample environment variable values for local runs
├── .dockerignore -- files excluded from the docker build context
├── .gitignore -- files excluded from Git tracking
├── Dockerfile -- for building the Docker image
├── Pipfile -- Python dependency management
├── Pipfile.lock -- pinned dependency versions
├── README.md -- this file
├── entrypoint.py -- entry-point file
├── mysql_dpkg_selection.txt -- selection values fed to dpkg inside the Docker image build
└── src -- source code
    ├── logging
    │   └── get_logger.py -- logger
    ├── restore_backup.py -- the dump restore process itself
    └── system_var
        ├── constants.py -- constants
        └── environment.py -- environment variables
```


@@ -1,10 +0,0 @@
"""Entry point for the sell-out & Ultmarc DB dump restore"""
from src import restore_backup

if __name__ == '__main__':
    try:
        exit(restore_backup.exec())
    except Exception:
        # Return a success exit code even when an error occurs.
        # The fact that an error occurred is logged inside batch_process.
        exit(0)
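The entry point above deliberately maps every outcome, including an unhandled exception, to exit code 0. A quick standalone check of that contract, using an inline stub that raises in place of `restore_backup.exec` (the stub is illustrative, not the real module):

```python
import subprocess
import sys

# Stub entry point mirroring entrypoint.py's catch-all: even though exec()
# raises, the process must still terminate with exit code 0.
stub = (
    "def exec():\n"
    "    raise RuntimeError('boom')\n"
    "try:\n"
    "    exit(exec())\n"
    "except Exception:\n"
    "    exit(0)\n"
)
result = subprocess.run([sys.executable, "-c", stub])
print(result.returncode)  # exit code observed by the caller
```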


@@ -1,3 +0,0 @@
1
1
4


@@ -1,37 +0,0 @@
import logging

from src.system_var.environment import LOG_LEVEL

# Set the log level of the boto3-related modules up front to suppress their DEBUG output
for name in ["boto3", "botocore", "s3transfer", "urllib3"]:
    logging.getLogger(name).setLevel(logging.WARNING)


def get_logger(log_name: str) -> logging.Logger:
    """Returns a uniquely named logger.

    Args:
        log_name (str): logger name

    Returns:
        logging.Logger: the configured logger
    """
    logger = logging.getLogger(log_name)
    level = logging.getLevelName(LOG_LEVEL)
    if not isinstance(level, int):
        level = logging.INFO
    logger.setLevel(level)
    if not logger.hasHandlers():
        handler = logging.StreamHandler()
        logger.addHandler(handler)
    formatter = logging.Formatter(
        '%(name)s\t[%(levelname)s]\t%(asctime)s\t%(message)s',
        '%Y-%m-%d %H:%M:%S'
    )
    for handler in logger.handlers:
        handler.setFormatter(formatter)
    return logger
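For reference, the tab-separated record format configured above can be exercised standalone. A minimal demo that wires a logger the same way, with the log level fixed at INFO instead of being read from the environment:

```python
import logging

# Build a logger the way get_logger does, with the level hard-coded to INFO.
demo = logging.getLogger("demo")
demo.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(
    "%(name)s\t[%(levelname)s]\t%(asctime)s\t%(message)s",
    "%Y-%m-%d %H:%M:%S"))
demo.addHandler(handler)

demo.info("dump restore: start")  # emitted as a tab-separated record
demo.debug("suppressed")          # below the configured level, not emitted
```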


@@ -1,108 +0,0 @@
"""Dump restore script"""
import os
import subprocess
import textwrap

from src.logging.get_logger import get_logger
from src.system_var import constants, environment

logger = get_logger('Dump restore script')


def exec():
    try:
        logger.info('Dump restore script: start')
        # Pre-processing (a no-op in the common implementation)
        _pre_exec()

        # Main processing
        # Build the MySQL connection settings
        my_cnf_file_content = f"""
        [client]
        user={environment.DB_USERNAME}
        password={environment.DB_PASSWORD}
        host={environment.DB_HOST}
        """
        # Path of the my.cnf file
        my_cnf_path = os.path.join('my.cnf')
        # Write the my.cnf file
        with open(my_cnf_path, 'w') as f:
            f.write(textwrap.dedent(my_cnf_file_content)[1:-1])
        os.chmod(my_cnf_path, 0o444)

        # Connect to the MySQL server beforehand to detect connection errors early
        mysql_pre_process = subprocess.Popen(
            ['mysql', f'--defaults-file={my_cnf_path}', '-P', f"{environment.DB_PORT}",
             environment.DB_SCHEMA, '-N', '-e', 'SELECT 1;'],
            stderr=subprocess.PIPE
        )
        _, error = mysql_pre_process.communicate()
        if mysql_pre_process.returncode != 0:
            logger.error(
                f'Failed to connect to the MySQL server. {"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS

        # Identify the dump file to restore
        s3_file_path = environment.DUMP_FILE_S3_PATH
        # Download the dump file locally with the `aws s3 cp` command
        s3_cp_process = subprocess.Popen(
            ['aws', 's3', 'cp', s3_file_path, './dump.gz'], stderr=subprocess.PIPE)
        _, error = s3_cp_process.communicate()
        if s3_cp_process.returncode != 0:
            logger.error(
                f'An error occurred while running `aws s3 cp`. {"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS
        # Close the S3 command's standard error
        s3_cp_process.stderr.close()

        # Decompress the dump file with gunzip
        gzip_process = subprocess.Popen(
            ['gunzip', '-c', './dump.gz'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        # Restore the dump with the mysql command
        mysql_process = subprocess.Popen(
            ['mysql', f'--defaults-file={my_cnf_path}', '-P',
             f"{environment.DB_PORT}", environment.DB_SCHEMA],
            stdin=gzip_process.stdout, stderr=subprocess.PIPE
        )
        # gzip's stdout is now connected to mysql, so close our handle on it
        gzip_process.stdout.close()
        _, error = mysql_process.communicate()
        if mysql_process.returncode != 0:
            logger.error(
                f'An error occurred while running the command. {"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS

        # Post-processing (a no-op in the common implementation)
        _post_exec()

        logger.info('[NOTICE]Dump restore script: end (normal completion)')
        return constants.BATCH_EXIT_CODE_SUCCESS
    except Exception as e:
        logger.exception(f'An unexpected error occurred in the dump restore script: {e}')
        return constants.BATCH_EXIT_CODE_SUCCESS


def _pre_exec():
    """
    Dump restore pre-processing.
    The common implementation does not perform any pre-processing.
    Copy this logic when implementing a dump restore that needs pre-processing.
    """
    pass


def _post_exec():
    """
    Dump restore post-processing.
    The common implementation does not perform any post-processing.
    Copy this logic when implementing a dump restore that needs post-processing.
    """
    pass
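The `gunzip | mysql` stage above chains two `Popen` objects through a pipe and then closes the parent's copy of the intermediate stream, which is what lets the downstream process see EOF. A minimal standalone sketch of that pattern, with `python -c` one-liners standing in for the real `gunzip` and `mysql` commands:

```python
import subprocess
import sys

# Producer stands in for `gunzip -c`; consumer stands in for `mysql`.
producer = subprocess.Popen(
    [sys.executable, "-c", "print('line1'); print('line2')"],
    stdout=subprocess.PIPE)
consumer = subprocess.Popen(
    [sys.executable, "-c", "import sys; print(sum(1 for _ in sys.stdin))"],
    stdin=producer.stdout, stdout=subprocess.PIPE)
# Close the parent's handle on the pipe (as the script does with
# gzip_process.stdout.close()) so the consumer sees EOF once the producer exits.
producer.stdout.close()
out, _ = consumer.communicate()
producer.wait()
print(out.decode().strip())  # number of lines the consumer received
```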

Some files were not shown because too many files have changed in this diff.