Compare commits

...

416 Commits

Author SHA1 Message Date
下田雅人
eee20d2653 Merge pull request #499 feature-NEWDWH2021-1744 into master 2025-06-16 11:06:44 +09:00
Nik Afiq
11c096d27f June 2025 vulnerability scan 2025-06-10 16:20:33 +09:00
朝倉 明日香
ffedc1d86d Merge pull request #478 release-v5.1.0 into master 2025-05-23 11:30:07 +09:00
下田雅人
381682243b Merge pull request #479 merge-conflict-20250523 into release-v5.1.0 2025-05-23 08:50:28 +09:00
shimoda.m@nds-tyo.co.jp
c80d437e56 Merge branch 'master' into merge-conflict-20250523 2025-05-23 08:49:42 +09:00
朝倉 明日香
5fc0e05e8c Merge pull request #472 feature-NEWDWH2021-1732 into master 2025-05-15 09:20:41 +09:00
Nik Afiq
370f598374 Pipfile update for the May vulnerability scan 2025-05-14 13:48:55 +09:00
小野 祥照
3ab9c3858c Merge pull request #470 feature-NEWDWH2021-1905-MailReplace into release-v5.1.0 2025-05-08 10:26:56 +09:00
mori.k
baf374118c Changed 「である」 in the email body to 「として、」 2025-05-08 10:10:40 +09:00
mori.k
08c0239ec0 Replaced 「Ⅰ/F」 with 「連携」 in the body and title of SAP received/not-received notifications 2025-05-08 09:40:29 +09:00
朝倉 明日香
4f953da78c Merge pull request #469 feature-NEWDWH2021-1905 into release-v5.1.0 2025-05-08 07:02:53 +09:00
mori.k
f3b936db70 Changed uppercase SAP to lowercase. 2025-05-07 18:23:02 +09:00
下田雅人
19df2f725b Merge pull request #466 feature-NEWDWH2021-1833 into master 2025-05-02 15:14:59 +09:00
下田雅人
e4c2d305d3 Merge pull request #463 feature-NEWDWH2021-1825 into develop-v5.1.0 2025-04-23 11:55:22 +09:00
shimoda.m@nds-tyo.co.jp
94e0325bec fix: Fixed a bug where the web application failed to start due to missing dependencies after numpy's required Python version became 3.10 or higher. 2025-04-21 10:05:38 +09:00
mori.k
a3d849a2b9 Merge branch 'master' into develop-v5.1.0 2025-04-17 14:22:08 +09:00
下田雅人
910322dc09 Merge pull request #465 featrue-NEWDWH2021-1831 into develop-v5.1.0 2025-04-17 13:19:06 +09:00
下田雅人
3e2beb998d Merge pull request #464 feature-NEWDWH2021-1743 into master 2025-04-17 12:57:21 +09:00
mori.k
c5426a4303 Materials for the April vulnerability scan 2025-04-16 17:51:31 +09:00
mori.k
39fdc3d60a Fixed the vulnerability scan commands for the parts updated to 3.12 2025-04-14 14:55:28 +09:00
下田雅人
e607d2a71c Merge pull request #462 featrue-NEWDWH2021-1823 into develop-v5.1.0 2025-04-10 10:57:17 +09:00
mori.k
d57abee28d View: removed the wording about inquiries in the security option check 2025-04-08 18:10:36 +09:00
mori.k
5a83987454 SAP-related: changed the text of the alert notification email addressed to Merck 2025-04-08 15:27:29 +09:00
mori.k
7a74bba999 Reverted the changes to main.py 2025-04-04 10:19:22 +09:00
mori.k
c603eea19e Changed the package used from gnupg to python-gnupg 2025-04-04 09:40:23 +09:00
mori.k
a173df8298 Changed the image used to python:3.12-slim-bookworm 2025-04-03 18:06:13 +09:00
mori.k
f51fb9c2a9 Removed requirements.txt to migrate to Pipfile 2025-04-02 18:09:41 +09:00
mori.k
f5667157af Use the public.ecr.aws/lambda/python:3.12 image 2025-04-02 17:57:39 +09:00
下田雅人
24546c3d7b Merge pull request #461 feature-NEWDWH2021-1820-ifx-dataimport-setting into develop-v5.1.0 2025-04-02 17:13:16 +09:00
下田雅人
53cd392262 Merge pull request #460 feature-NEWDWH2021-1820 into develop-v5.1.0 2025-04-02 17:12:47 +09:00
mori.k
5911220f9a Slimmed down the image and rebuilt Pipfile and Pipfile.lock 2025-04-02 15:43:23 +09:00
mori.k
925acc70a5 Removed unnecessary items 2025-03-27 14:07:21 +09:00
mori.k
4d2d8cda83 Fixed the parts flagged in review as excessive or missing 2025-03-27 13:40:33 +09:00
mori.k
345704f131 Reflected field additions and removals in the data registration config file 2025-03-26 15:43:56 +09:00
mori.k
49d61ad6d7 Changed the added/removed fields for CRM object maintenance 2025-03-26 09:40:32 +09:00
mori.k
8b8df71494 Merge branch 'develop-v5.1.0' into feature-NEWDWH2021-1820 2025-03-25 16:55:26 +09:00
mori.k
377b260c05 Ran pipenv update 2025-03-25 16:39:40 +09:00
mori.k
8b511a948b Merge branch 'master' into develop-v5.1.0 2025-03-25 16:26:48 +09:00
mori.k
a46a42c333 Revert "Updated crm_object_list_diff for CRM object maintenance"
This reverts commit f7ebe9dc2be8268d1456e5f53adacefbaed0703a.
2025-03-25 13:30:24 +09:00
mori.k
69328196e8 Revert "Newly added last_fetch_datetime for the fields changed this time"
This reverts commit 623cf58a9474f50092f630a4c5aed8c9f34b7d47.
2025-03-25 13:29:14 +09:00
mori.k
f7ebe9dc2b Updated crm_object_list_diff for CRM object maintenance 2025-03-25 11:00:56 +09:00
mori.k
623cf58a94 Newly added last_fetch_datetime for the fields changed this time 2025-03-24 14:06:17 +09:00
朝倉 明日香
8985144263 Merge pull request #459 feature-NEWDWH2021-1738 into master 2025-03-17 11:22:16 +09:00
yono
7a22ab74ef refactor: March 2025 vulnerability scan response 2025-03-13 13:35:35 +09:00
下田雅人
19e5e1abc4 Merge pull request #458 feature-NEWDWH2021-1798 into develop-v5.1.0 2025-03-05 17:44:56 +09:00
下田雅人
2be5c5b229 Merge pull request #457 feature-NEWDWH2021-1796 into develop-v5.1.0 2025-03-05 13:16:01 +09:00
mori.k
1d3cb6ea67 Version upgrade & slimming: building as-is failed with "gnupg is not installed", so added a line to install gnupg 2025-03-03 17:13:36 +09:00
mori.k
8dfef588bc Version upgrade & slimming: building as-is failed with "gnupg is not installed", so added a line to install gnupg 2025-03-03 16:33:21 +09:00
下田雅人
bce0bbf3e9 Merge pull request #456 feature-NEWDWH2021-1795 into develop-v5.1.0 2025-02-28 15:25:47 +09:00
mori.k
2abb907779 Removed the WORKDIR-related lines and changed the copy destination of Pipfile and Pipfile.lock to ./ 2025-02-26 16:33:49 +09:00
mori.k
b718941395 Slimmed down the image and rebuilt Pipfile and Pipfile.lock 2025-02-25 16:17:33 +09:00
朝倉 明日香
113bcd102e Merge pull request #455 release-v5.0.0 into master 2025-02-25 11:39:25 +09:00
下田雅人
0be701236b Merge pull request #453 feature-NEWDWH2021-1775 into develop-v5.1.0 2025-02-21 17:32:14 +09:00
下田雅人
c07360e8bb Merge pull request #454 feature-NEWDWH2021-1735 into master 2025-02-17 11:17:08 +09:00
下田雅人
062b0df54d Merge pull request #451 feature-NEWDWH2021-1770 into develop 2025-02-17 11:07:51 +09:00
mori.k
454a20f79c Changed to run docker pull before tagging 2025-02-14 18:07:28 +09:00
yono
7c5c998284 refactor: updates from the vulnerability scan 2025-02-14 17:54:15 +09:00
mori.k
a6771be294 Changed each test case's output values to the correct format 2025-01-29 15:46:48 +09:00
mori.k
f811038c02 Changed the documented tool version from 3.9.x to 3.12.x 2025-01-28 12:02:31 +09:00
mori.k
d32ae5b24a Changed versions in the Pipfile and updated Pipfile.lock accordingly 2025-01-28 10:31:23 +09:00
mori.k
519cb1c260 Version upgrade for the vulnerability scan tool 2025-01-27 16:39:20 +09:00
mori.k
b9bd007e91 Version upgrade for CRM data fetch and fixed log output 2025-01-27 15:24:53 +09:00
下田雅人
f7b0fa09d0 Merge pull request #452 feature-NEWDWH2021-1774 into develop-v5.1.0 2025-01-27 09:33:29 +09:00
mori.k
08580204a2 Fixes accompanying the slimming of the vulnerability scan tool 2025-01-24 14:43:48 +09:00
yono
9243343bbd feat: fixed the config file because CRM fields increased with the oneCIAM ID reassignment work 2025-01-15 18:51:45 +09:00
shimoda.m@nds-tyo.co.jp
2f38c1c526 feat: Python version upgrade and Docker image slimming for the data registration process 2025-01-09 14:28:04 +09:00
下田雅人
6a28baf983 Merge pull request #449 release-v4.8.0 into master 2024-12-19 09:24:17 +09:00
朝倉 明日香
4534922103 Merge pull request #447 feature-NEWDWH2021-1723-dataimport into develop 2024-12-16 11:18:32 +09:00
朝倉 明日香
4b278cb6ad Merge pull request #448 feather-NEWDWH2021-1376-calendar2025 into develop 2024-12-09 11:13:43 +09:00
Asuka Asakura
a1507f5cc1 2025 calendar 2024-12-09 09:30:01 +09:00
朝倉 明日香
95134e7348 Merge pull request #446 feature-NEWDWH2021-1723-crm-datafetch into develop 2024-12-02 16:22:59 +09:00
Asuka Asakura
69a9982c10 Fixed a defect in the extended SQL 2024-12-02 16:20:48 +09:00
Asuka Asakura
778715148c Added Clm_Presentation_vod__c fields + history recording 2024-11-29 11:39:57 +09:00
shimoda.m@nds-tyo.co.jp
0a187fadd2 feat: added 2 fields to Clm_Presentation_vod__c 2024-11-26 09:23:40 +09:00
朝倉 明日香
130dca2263 Merge pull request #445 master into develop 2024-11-12 13:23:24 +09:00
下田雅人
2b2077cbfc Merge pull request #444 feature-NEWDWH2021-1394 into master 2024-11-12 13:04:10 +09:00
Asuka Asakura
234c791282 November 2024 vulnerability scan 2024-11-12 11:50:39 +09:00
下田雅人
beaf2ff9b6 Merge pull request #443 feature-NEWDWH2021-1710 into master 2024-11-11 11:26:59 +09:00
Nik Afiq
79bf28a4f7 Display the digest value in the log 2024-11-09 14:02:08 +09:00
Nik Afiq
bfbfc5c7bb Updated README 2024-11-09 14:01:41 +09:00
Nik Afiq
ac14adab01 Implemented digest verification 2024-10-31 11:59:51 +09:00
下田雅人
89f33ccc8d Merge pull request #440 feature-NEWDWH2021-1717 into develop 2024-10-25 11:03:13 +09:00
朝倉 明日香
6f2e157103 Merge pull request #442 master into develop 2024-10-24 18:27:38 +09:00
朝倉 明日香
7910f33c6a Merge pull request #441 release-v4.7.0 into master 2024-10-24 18:20:33 +09:00
Asuka Asakura
872a8f7360 Change the character encoding of data received from Encise to UTF-8 2024-10-24 11:58:07 +09:00
shimoda.m@nds-tyo.co.jp
7e634b9cc0 fix: the lock file was broken after conflict resolution, so fixed it. 2024-10-24 09:21:29 +09:00
shimoda.m@nds-tyo.co.jp
6b4bc6a6ed Merge branch 'master' into release-v4.7.0 2024-10-24 09:15:26 +09:00
朝倉 明日香
d41d9c87b3 Merge pull request #439 feature-NEWDWH2021-1393 into master 2024-10-21 11:07:22 +09:00
shimoda.m@nds-tyo.co.jp
7f7b419bf3 feat: applied the results of the October vulnerability scan. 2024-10-15 10:51:08 +09:00
shimoda.m@nds-tyo.co.jp
f6588272cd fix: handled NEWDWH2021-794. Removed duplicate rows. 2024-10-15 08:55:49 +09:00
下田雅人
6051b77709 Merge pull request #438 feature-NEWDWH2021-1699 into release-v4.7.0 2024-10-11 14:42:39 +09:00
shimoda.m@nds-tyo.co.jp
901643974e docs: fixed permissions (RDX for owner and group, R for others) 2024-10-10 14:44:11 +09:00
shimoda.m@nds-tyo.co.jp
379785b542 docs: addressed review feedback. Run chmod before registering the service. 2024-10-10 14:42:30 +09:00
shimoda.m@nds-tyo.co.jp
5f513d974d docs: addressed review feedback. Fixed typos. 2024-10-10 13:56:43 +09:00
shimoda.m@nds-tyo.co.jp
a6d74da063 docs: fixed folder paths 2024-10-07 14:31:26 +09:00
shimoda.m@nds-tyo.co.jp
32c4e498f9 feat: added the production config file 2024-10-07 10:47:57 +09:00
shimoda.m@nds-tyo.co.jp
55bbc69043 style: aligned the Description with the file name 2024-10-07 10:47:43 +09:00
shimoda.m@nds-tyo.co.jp
a99e9d75dc docs: added README 2024-10-07 10:47:27 +09:00
下田雅人
4d8f065de2 Merge pull request #437 develop into release-v4.7.0 2024-10-07 09:15:50 +09:00
下田雅人
77466af426 Merge pull request #436 feature-NEWDWH2021-1711 into develop 2024-10-02 09:07:15 +09:00
Asuka Asakura
1398cbe7bf Added Event_Attendee_vod__c fields 2024-10-01 14:53:12 +09:00
Asuka Asakura
8eaab2455c Added Event_Attendee_vod__c fields 2024-10-01 14:48:11 +09:00
朝倉 明日香
97d0c6f367 Merge pull request #434 feature-NEWDWH2021-1392 into master 2024-09-13 11:03:40 +09:00
下田雅人
50a4c0e615 Merge pull request #433 feature-NEWDWH2021-1568 into develop 2024-09-13 10:09:43 +09:00
朝倉 明日香
69af7b98be Merge pull request #432 feature-NEWDWH2021-1692 into develop 2024-09-13 07:57:08 +09:00
shimoda.m@nds-tyo.co.jp
bf611cd0cb fix: changed re-tagging and pushing to production to go scan-point tag → staging latest → production.
Otherwise pushing to production becomes impossible when work is handed over midway.
2024-09-12 11:57:48 +09:00
shimoda.m@nds-tyo.co.jp
0a25b147e9 fix: rebuilt because of a version mistake in CRM data fetch 2024-09-12 11:50:13 +09:00
yuusuke_kanamura
2ffc80682d feat: created the config file to auto-run the socat process when the instance starts 2024-09-10 17:57:36 +09:00
Asuka Asakura
31fcc86987 Fixed a wrong cd path 2024-09-10 10:26:36 +09:00
Asuka Asakura
62828d8e9b Removed User fields that fail to fetch 2024-09-02 19:01:28 +09:00
Asuka Asakura
a6f19c2375 Removed unnecessary files 2024-09-02 11:45:16 +09:00
朝倉 明日香
9098bdca6c Merge pull request #431 feature-NEWDWH2021-1686 into master 2024-09-02 11:14:27 +09:00
shimoda.m@nds-tyo.co.jp
a0cb93bcdb docs: updated README 2024-08-30 16:52:20 +09:00
shimoda.m@nds-tyo.co.jp
b2cdb4dc2a fix: fixed a command mistake 2024-08-30 16:42:48 +09:00
shimoda.m@nds-tyo.co.jp
e87dfd7aac feat: added the vulnerability scan command for the Medpass data transfer process 2024-08-30 16:35:32 +09:00
shimoda.m@nds-tyo.co.jp
de6069ead6 fix: corrected the shell script file name 2024-08-30 16:29:21 +09:00
Asuka Asakura
d5f31e2c6d Fixed a mistake in the User fetch-NG fields 2024-08-30 16:13:55 +09:00
Asuka Asakura
c20bf12c6e CRM Object Maintenance 2024-08-29 20:25:45 +09:00
Asuka Asakura
393bbd2bdc CRM Object Maintenance 2024-08-29 20:25:12 +09:00
Nik Afiq
5f3ef51454 Added README 2024-08-29 11:02:02 +09:00
Nik Afiq
8bdd70ba00 Added Readme 2024-08-27 11:49:28 +09:00
Nik Afiq
d4bcbddeae Implemented the ECR update as a script 2024-08-27 11:42:24 +09:00
朝倉 明日香
99ef3a1d8a Merge pull request #429 master into develop 2024-08-26 11:03:25 +09:00
朝倉 明日香
edfc3cab8b Merge pull request #428 release-v4.6.0 into master 2024-08-26 11:03:00 +09:00
下田雅人
ae15dffef9 Merge pull request #427 fix-NEWDWH2021-1681-resolve-dependencies into develop 2024-08-21 16:26:24 +09:00
shimoda.m@nds-tyo.co.jp
39fd1383bd fix: unit tests had stopped working after the pytest and pytest-html version upgrades; fixed. Also fixed the s3 tests, which had not picked up the fixes to the original module. 2024-08-21 15:57:28 +09:00
shimoda.m@nds-tyo.co.jp
dd891aa548 fix: added the dependency modules required by simple-salesforce 1.12.6 that were missing from the lock file. 2024-08-21 15:12:36 +09:00
下田雅人
18f6e0391b Merge pull request #426 feature-NEWDWH2021-1681 into develop 2024-08-20 12:22:52 +09:00
shimoda.m@nds-tyo.co.jp
9a08d71bff Merge branch 'master' into feature-NEWDWH2021-1681 2024-08-20 11:10:53 +09:00
朝倉 明日香
dfc734f4d7 Merge pull request #424 master into release-v4.6.0 2024-08-19 11:35:07 +09:00
朝倉 明日香
6f3ecac69b Merge pull request #423 release-v4.5.0 into master 2024-08-19 11:33:26 +09:00
朝倉 明日香
812adb4641 Merge pull request #422 feature-NEWDWH2021-1391 into master 2024-08-19 11:32:33 +09:00
Nik Afiq
a616373297 Ran the August vulnerability scan 2024-08-14 11:26:57 +09:00
下田雅人
8e6413caaf Merge pull request #421 feature-NEWDWH2021-1677 into release-v4.6.0 2024-08-05 13:39:09 +09:00
Nik Afiq
6f3f205a62 Changed the I-02-02 log from showing the date to showing only the data source 2024-08-05 13:19:27 +09:00
朝倉 明日香
5027a72a50 Merge pull request #420 feature-NEWDWH2021-1666 into develop 2024-08-05 11:36:14 +09:00
shimoda.m@nds-tyo.co.jp
465cf4b8ac feat: updated the Salesforce API version for CRM data fetch to the latest 2024-08-05 09:32:17 +09:00
下田雅人
c54f8a3836 Merge pull request #419 develop-v4.6.0 into develop 2024-07-24 13:05:30 +09:00
下田雅人
98dd3c6f81 Merge pull request #418 develop-v4.6.0 into release-v4.6.0 2024-07-24 13:03:01 +09:00
朝倉 明日香
a07edd5503 Merge pull request #417 feature-NEWDWH2021-1661 into develop-v4.6.0 2024-07-24 11:30:41 +09:00
yuusuke_kanamura
31c9ecba08 feat: removed the trailing comma from CSV fields 2024-07-23 11:43:45 +09:00
yuusuke_kanamura
e46ac4e2e2 feat: removed the dummy field from the header field count 2024-07-23 10:49:27 +09:00
yuusuke_kanamura
d049534ab9 feat: added dummy fields to absorb the Medpass data spec 2024-07-22 17:53:57 +09:00
朝倉 明日香
ab1ed8c55a Merge pull request #416 develop into release-v4.5.0 2024-07-22 11:27:15 +09:00
朝倉 明日香
043d892eb2 Merge pull request #415 release-v4.4.0 into master 2024-07-22 11:19:52 +09:00
朝倉 明日香
ba80ca951e Merge pull request #413 feature-NEWDWH2021-1616 into develop-v4.6.0 2024-07-19 17:30:09 +09:00
Nik Afiq
9388ce0ee7 Changed log display 2024-07-19 16:30:51 +09:00
Nik Afiq
5d55d438ed Added file directory display to the duplicate notification 2024-07-19 15:30:56 +09:00
朝倉 明日香
5a2fa9864a Merge pull request #414 feature-NEWDWH2021-1620 into develop-v4.6.0 2024-07-19 15:24:18 +09:00
Nik Afiq
994c40ec55 Changed the notification display to match the spec 2024-07-19 14:59:19 +09:00
朝倉 明日香
93945ae643 Merge pull request #409 feature-NEWDWH2021-1609 into develop-v4.6.0 2024-07-19 10:39:14 +09:00
Nik Afiq
610a0acdf1 Changed log display 2024-07-18 16:04:21 +09:00
shimoda.m@nds-tyo.co.jp
e0ed18efaf fix: fixed the error log ID 2024-07-18 14:16:48 +09:00
shimoda.m@nds-tyo.co.jp
1b3995d465 fix: fixed log wording 2024-07-18 13:54:37 +09:00
Nik Afiq
63c00f61b7 Revert "Changed log display"
This reverts commit 84b2fbafab12acb6b09b3606d2080dba38f7de90.
2024-07-18 09:54:24 +09:00
Nik Afiq
84b2fbafab Changed log display 2024-07-17 18:13:36 +09:00
Nik Afiq
96a14447e6 Fixed a regular expression 2024-07-17 16:27:51 +09:00
shimoda.m@nds-tyo.co.jp
cada4f7b09 style: fixed comments 2024-07-17 10:43:56 +09:00
shimoda.m@nds-tyo.co.jp
a058ef2d75 fix: fixed an error that occurred when the Subject was too long in the log-to-SNS publish feature. 2024-07-17 10:38:30 +09:00
shimoda.m@nds-tyo.co.jp
9dbff34bf5 feat: fixed the extended SQL 2024-07-17 10:34:18 +09:00
shimoda.m@nds-tyo.co.jp
3c761cd038 feat: fixed how the regular expression is interpreted. Fixed a wrong config key. 2024-07-17 10:29:59 +09:00
Nik Afiq
b01f488b08 Fixed typos; removed the title from the body 2024-07-17 10:19:33 +09:00
yuusuke_kanamura
54c3646cc4 feat: added a NULL check 2024-07-17 10:13:49 +09:00
yuusuke_kanamura
fa68ea7d8a feat: removed the NULL check for string fields (_org) 2024-07-17 10:12:40 +09:00
Nik Afiq
0d93f26bbf Added a config file 2024-07-17 09:54:08 +09:00
Nik Afiq
fd975d55c8 Changed the email creation feature 2024-07-17 09:23:58 +09:00
Nik Afiq
384e069fb8 Implemented the not-received check 2024-07-16 16:38:45 +09:00
shimoda.m@nds-tyo.co.jp
42fe7804c4 fix: fixed the Palantir file name regular expression 2024-07-16 16:09:00 +09:00
shimoda.m@nds-tyo.co.jp
838f5122fe feat: added S3 bucket-to-bucket file transfer processing 2024-07-16 16:06:04 +09:00
下田雅人
f2967be8f9 Merge pull request #412 master into release-v4.4.0 2024-07-16 14:42:04 +09:00
下田雅人
ef740bf707 Merge pull request #411 master into develop-v4.6.0 2024-07-16 14:41:30 +09:00
下田雅人
df428d8d2e Merge pull request #410 master into develop 2024-07-16 14:40:56 +09:00
下田雅人
fc4cf368fc Merge pull request #408 feature-NEWDWH2021-1390 into master 2024-07-16 11:28:13 +09:00
yuusuke_kanamura
8f0e0ec65f feat: created the stored function 2024-07-16 10:25:43 +09:00
yuusuke_kanamura
21126da1ba feat: fixed the extended SQL 2024-07-16 09:06:34 +09:00
yuusuke_kanamura
d3d3b95f34 feat: changed full-width spaces to half-width spaces 2024-07-11 14:56:06 +09:00
yuusuke_kanamura
bc7795ee3d feat: fixed hcp_web_access_log comments 2024-07-11 14:46:03 +09:00
yuusuke_kanamura
473328c639 feat: uppercased SQL reserved words; added a comment about the JST conversion 2024-07-11 14:45:01 +09:00
yuusuke_kanamura
6ee8085a07 feat: HCP Web: created the extended SQL 2024-07-11 12:53:31 +09:00
yuusuke_kanamura
8a7fb7381c feat: HCP Web: created the individual config file 2024-07-11 11:35:09 +09:00
yuusuke_kanamura
0506f20484 feat: HCP Web: fixed the schema name file targeted by the View option check 2024-07-11 11:32:48 +09:00
yuusuke_kanamura
0b5e0653d7 feat: created the HCP Web individual config mapping list 2024-07-11 11:31:35 +09:00
shimoda.m@nds-tyo.co.jp
ead269d5b3 feat: applied the patches for the 2024 vulnerability scan. 2024-07-10 12:47:39 +09:00
下田雅人
e3d8769c17 Merge pull request #406 feature-NEWDWH2021-1618 into develop-v4.6.0 2024-07-08 16:11:13 +09:00
Nik Afiq
bd76231de4 Work-in-progress commit 2024-07-04 17:27:26 +09:00
朝倉 明日香
f2e19041e6 Merge pull request #407 feature-NEWDWH2021-1633 into develop 2024-07-02 16:29:38 +09:00
yuusuke_kanamura
4afecb847b feat: fixed the call column names per Asakura-san's review feedback 2024-07-02 14:23:40 +09:00
yuusuke_kanamura
2860614d2d feat: fixed the stored procedure for the added call columns 2024-07-02 14:16:58 +09:00
yuusuke_kanamura
a127f3e541 feat: fixed the config files for the object additions and added columns 2024-07-02 10:40:05 +09:00
shimoda.m@nds-tyo.co.jp
7156fab12e feat: handle uppercase file extensions (probably never happens, but just in case) 2024-07-01 16:30:24 +09:00
shimoda.m@nds-tyo.co.jp
ad8c18aa23 style: fixed comments 2024-07-01 16:23:15 +09:00
shimoda.m@nds-tyo.co.jp
d26f4c5a3f fix: fixed logs 2024-07-01 16:17:39 +09:00
shimoda.m@nds-tyo.co.jp
d16762a701 fix: fixed log output wording 2024-07-01 16:09:49 +09:00
shimoda.m@nds-tyo.co.jp
bc29d22526 feat: fixed the implementation to match the design 2024-07-01 15:23:38 +09:00
shimoda.m@nds-tyo.co.jp
8397998bb3 feat: optimized the libraries to install 2024-07-01 11:27:16 +09:00
shimoda.m@nds-tyo.co.jp
479fe869bf feat: supported Python 3.12. Optimized the Dockerfile. 2024-07-01 09:43:00 +09:00
shimoda.m@nds-tyo.co.jp
54d5961614 feat: newly created. The codebase reuses the one from the implementation verification. 2024-06-27 17:31:34 +09:00
朝倉 明日香
fc49549004 Merge pull request #404 master into release-v4.4.0 2024-06-18 11:18:31 +09:00
朝倉 明日香
b3dc77fa44 Merge pull request #403 master into develop 2024-06-18 11:17:47 +09:00
朝倉 明日香
17ffbc136c Merge pull request #402 feature-NEWDWH2021-1389 into master 2024-06-18 11:16:46 +09:00
Nik Afiq
185953b359 June vulnerability scan 2024-06-13 08:49:16 +09:00
下田雅人
318373a321 Merge pull request #401 develop into release-v4.4.0 2024-05-27 12:03:53 +09:00
下田雅人
3b5f19754c Merge pull request #400 master into develop 2024-05-27 12:01:20 +09:00
朝倉 明日香
ed49c81153 Merge pull request #395 feature-NEWDWH2021-880 into develop 2024-05-27 11:26:38 +09:00
朝倉 明日香
a7fe7f58c5 Merge pull request #399 release-emp-chg-inst into master 2024-05-27 11:24:41 +09:00
Nik Afiq
ea5b8a930d Changed variables in class functions 2024-05-21 16:09:11 +09:00
朝倉 明日香
a5a577d301 Merge pull request #398 master into release-emp-chg-inst 2024-05-20 11:33:34 +09:00
朝倉 明日香
ac881db8f3 Merge pull request #397 master into develop 2024-05-20 11:32:43 +09:00
朝倉 明日香
3ce9772edb Merge pull request #396 feature-NEWDWH2021-1388 into master 2024-05-20 11:31:40 +09:00
Nik Afiq
4a2a1ce71a Pipfile.lock update for the vulnerability scan 2024-05-20 09:09:24 +09:00
Nik Afiq
de1e82273e Separated the get and read processing; renamed the interface 2024-05-15 11:19:00 +09:00
Nik Afiq
e55e95a6e9 sap-fin-receive-check-daily-fix 2024-05-15 10:13:40 +09:00
Nik Afiq
29780dab94 check-view-option-fix 2024-05-14 17:41:54 +09:00
Nik Afiq
98da72acb0 sap-sup-receive-check-monthly-fix 2024-05-14 17:35:08 +09:00
Nik Afiq
29f279702d sap-sup-monthly-data-fix 2024-05-14 17:21:43 +09:00
Nik Afiq
d0c37c962c sap-sup-receive-check-daily-fix 2024-05-14 16:57:17 +09:00
Nik Afiq
eb9b70cbb3 sap-fin-receive-check-monthly-fix 2024-05-14 16:22:55 +09:00
Nik Afiq
86945ad25d sap-fin-check-daily-fix 2024-05-13 17:52:53 +09:00
Nik Afiq
2ce1998418 sap-fin-monthly-data-fix 2024-05-13 16:29:42 +09:00
Nik Afiq
2471c86c6c sap-data-decrypt-fix 2024-05-13 16:17:00 +09:00
Nik Afiq
b5a6dde1a5 crm-datafech fix 2024-05-13 09:43:11 +09:00
下田雅人
2f0b9e35e2 Merge pull request #394 master into develop 2024-05-09 18:00:29 +09:00
朝倉 明日香
a7e90d87f6 Merge pull request #393 feature-NEWDWH2021-1557-fix-conflict into master 2024-05-09 11:24:29 +09:00
shimoda.m@nds-tyo.co.jp
feb9021241 fix: re-added the last-fetch-datetime file that had been deleted 2024-05-09 10:40:23 +09:00
shimoda.m@nds-tyo.co.jp
0de20f7f8e fix: removed unnecessary files that had been added 2024-05-09 10:40:19 +09:00
Nik Afiq
c43f5aa408 Merge branch 'develop' into feature-NEWDWH2021-880 2024-05-08 14:06:13 +09:00
Nik Afiq
1dc58e577f Merge branch 'develop' into feature-NEWDWH2021-880 2024-05-08 13:53:43 +09:00
下田雅人
82b4de78c5 Merge pull request #392 develop-emp-chg-inst into develop 2024-05-08 13:34:04 +09:00
朝倉 明日香
6fc8fa9104 Merge pull request #391 develop-emp-chg-inst into release-emp-chg-inst 2024-05-07 11:09:12 +09:00
朝倉 明日香
891cf0a97a Merge pull request #390 master into develop-emp-chg-inst 2024-04-26 15:36:38 +09:00
朝倉 明日香
2ee3161e30 Merge pull request #389 master into develop 2024-04-26 15:35:11 +09:00
Asuka Asakura
0eaeb270fb Merge branch 'master' of https://nds-tyo.backlog.com/git/NEWDWH2021/newsdwh2021 2024-04-26 15:24:52 +09:00
Asuka Asakura
6461e24a0f CRM enhancement & object addition support 2024-04-26 15:24:33 +09:00
朝倉 明日香
e5034ef248 Merge pull request #388 release-crm-enhance into master 2024-04-26 15:18:03 +09:00
下田雅人
048ebdd602 Merge pull request #387 feature-NEWDWH2021-1543 into develop-emp-chg-inst 2024-04-26 11:42:54 +09:00
下田雅人
84cddc33d8 Merge pull request #386 feature-NEWDWH2021-1544 into develop-emp-chg-inst 2024-04-26 11:42:15 +09:00
shimoda.m@nds-tyo.co.jp
93c4613456 fix: integration test case M-1-2 defect. Missing SQL condition for termination and representative change 2024-04-25 16:40:23 +09:00
shimoda.m@nds-tyo.co.jp
469d2f46bc fix: integration test defect fix.
Facility representative records with a different representative type were being invalidated, and the record for the duplicate partner code was not created.
The search condition for duplicate-partner-code facility representative records was missing the representative type code.
2024-04-25 10:35:59 +09:00
下田雅人
fa23ae3964 Merge pull request #385 feature-NEWDWH2021-1540 into develop-emp-chg-inst 2024-04-24 11:12:39 +09:00
shimoda.m@nds-tyo.co.jp
6208fb1ac3 feat: review fix. Changed the data existence check to use the field logical-name constants when generating its message 2024-04-23 18:24:24 +09:00
shimoda.m@nds-tyo.co.jp
f830c55160 fix: fixed a bug where the message shown on a duplicate error during new facility registration did not include the representative type code 2024-04-23 17:17:47 +09:00
下田雅人
5b47804c3f Merge pull request #384 feature-NEWDWH2021-1507 into develop-emp-chg-inst 2024-04-23 13:06:41 +09:00
shimoda.m@nds-tyo.co.jp
a726bb90fe feat: removed comments; fixed the order of conditions 2024-04-23 13:01:47 +09:00
nik.n
cc0e1cb576 Changed the error check order 2024-04-23 11:27:15 +09:00
nik.n
18f215f4f6 Added the representative type code to the SQL search conditions 2024-04-22 15:07:07 +09:00
nik.n
35993730d6 Added a condition for a missing start date 2024-04-22 10:32:55 +09:00
nik.n
ac2c2f0f16 Added logic to skip the representative type check when start_date does not exist 2024-04-18 17:36:53 +09:00
nik.n
87d7e2d305 Merge branch 'feature-NEWDWH2021-1507' into develop-emp-chg-inst 2024-04-18 13:07:00 +09:00
下田雅人
a7d26090e5 Merge pull request #383 master into develop-emp-chg-inst 2024-04-18 11:48:44 +09:00
下田雅人
9f78307498 Merge pull request #382 master into release-crm-enhance 2024-04-18 11:47:21 +09:00
下田雅人
7e8b26c378 Merge pull request #381 master into develop 2024-04-18 11:46:04 +09:00
朝倉 明日香
88e8b1a0b1 Merge pull request #380 feature-NEWDWH2021-1387 into master 2024-04-18 11:16:24 +09:00
朝倉 明日香
63c4d149d8 Merge pull request #377 feature-NEWDWH2021-1507-download into develop-emp-chg-inst 2024-04-18 11:11:18 +09:00
nik.n
b878ded447 Review fixes 2024-04-18 10:27:08 +09:00
nik.n
e29bbe4455 Implemented facility representative change; review fixes 2024-04-17 18:45:07 +09:00
shimoda.m@nds-tyo.co.jp
489e96ffb9 feat: changed the Debian release version used by the DB dump/restore Docker image (bullseye → bookworm)
The Python:3.9 image defaults to bookworm when no release name is specified
2024-04-17 13:15:50 +09:00
朝倉 明日香
267ae37ec9 Merge pull request #376 feature-NEWDWH2021-1503 into develop-emp-chg-inst 2024-04-17 11:10:42 +09:00
nik.n
df0f2d129f Implemented representative type code CSV upload 2024-04-17 10:23:33 +09:00
shimoda.m@nds-tyo.co.jp
d2e6d3bf8d feat: April 2024 vulnerability scan response 2024-04-17 09:33:16 +09:00
下田雅人
0a6fca9aa3 Merge pull request #378 master into release-crm-enhance 2024-04-17 09:32:35 +09:00
nik.n
9576dc054a Mid-work push 2024-04-16 16:57:35 +09:00
shimoda.m@nds-tyo.co.jp
f9557f86f5 style: fixed comments 2024-04-16 16:44:03 +09:00
shimoda.m@nds-tyo.co.jp
1a95ca1c84 fix: fixed the initial display after running a CSV download 2024-04-16 16:41:47 +09:00
shimoda.m@nds-tyo.co.jp
f897f34672 Merge branch 'develop-emp-chg-inst' into feature-NEWDWH2021-1507-download 2024-04-16 11:49:18 +09:00
shimoda.m@nds-tyo.co.jp
8c8be962a1 fix: fixed defects in the JST conversion 2024-04-16 11:25:12 +09:00
朝倉 明日香
d6f69905ae Merge pull request #372 feature-NEWDWH2021-1513-new into develop-emp-chg-inst 2024-04-16 11:10:57 +09:00
shimoda.m@nds-tyo.co.jp
5acca202ac feat: fixed the facility representative master CSV download screen 2024-04-15 18:05:00 +09:00
shimoda.m@nds-tyo.co.jp
240945b8d5 style: removed unnecessary comments 2024-04-15 17:38:37 +09:00
shimoda.m@nds-tyo.co.jp
db0e28db3e fix: changed the timezone for user master updates to JST (using the common function) 2024-04-15 15:29:10 +09:00
nik.n
a7bc502f36 Mid-work push 2024-04-15 08:56:26 +09:00
朝倉 明日香
528b87fda9 Merge pull request #375 master into develop 2024-04-12 17:31:01 +09:00
朝倉 明日香
961c1b4908 Merge pull request #374 release-202404 into master 2024-04-12 17:01:38 +09:00
nik.n
166dde4848 Changed the condition order to match the spec 2024-04-11 09:06:18 +09:00
shimoda.m@nds-tyo.co.jp
e06d88b747 feat: fixed logs 2024-04-10 09:37:55 +09:00
nik.n
eb7d8bfb39 Changed date recording to GMT+9 2024-04-09 14:10:28 +09:00
nik.n
fb04d6fc48 Changed the condition for judging 10 failed logins 2024-04-04 16:53:43 +09:00
nik.n
4d09f9973a Removed unnecessary DB update processing 2024-04-04 16:41:54 +09:00
nik.n
cfdac2b9f4 Changed the SQL update logic and the failure check 2024-04-04 10:08:08 +09:00
shimoda.m@nds-tyo.co.jp
1a218c6861 feat: facility representative master full-refresh fixes 2024-04-04 09:59:23 +09:00
shimoda.m@nds-tyo.co.jp
05c120b56e feat: DCF integrated facility master daily update fixes 2024-04-04 09:58:04 +09:00
nik.n
ac5cfc4d0f Changed the login failure judgment logic class 2024-04-03 17:16:08 +09:00
nik.n
fa3100b830 Removed class functions 2024-04-03 16:20:18 +09:00
nik.n
7a65e2b46e Changed logic 2024-04-03 16:05:41 +09:00
下田雅人
f0e67a07b5 Merge pull request #373 develop into release-crm-enhance 2024-04-03 15:16:37 +09:00
nik.n
000e9c006f Fixed logic to match the spec 2024-04-03 10:57:54 +09:00
nik.n
4270582d7a Fixed naming per review; removed the active flag check 2024-04-02 17:50:50 +09:00
nik.n
5d9b692982 Implemented account lockout 2024-04-02 15:42:02 +09:00
朝倉 明日香
b726e9fa34 Merge pull request #371 feature-NEWDWH2021-1512 into release-202404 2024-04-02 11:18:23 +09:00
Asuka Asakura
2856b0b406 Added one field to commercial events 2024-04-01 16:03:26 +09:00
下田雅人
9ac9dd8376 Merge pull request #370 master into develop 2024-04-01 15:00:14 +09:00
朝倉 明日香
80ddf8302b Merge pull request #369 release-8-encise-automation into master 2024-04-01 11:22:04 +09:00
shimoda.m@nds-tyo.co.jp
1c7e0c2e19 Merge branch 'release-8-encise-automation' of nds-tyo.git.backlog.com:/NEWDWH2021/newsdwh2021 into release-8-encise-automation 2024-04-01 09:40:01 +09:00
shimoda.m@nds-tyo.co.jp
d68afb1485 Merge branch 'master' into release-8-encise-automation 2024-04-01 09:39:42 +09:00
下田雅人
316e12677e Merge pull request #367 feature-NEWDWH2021-1495 into release-8-encise-automation 2024-03-28 10:44:06 +09:00
nik.n
b853c71b35 Added the DB dump SG value 2024-03-28 10:39:43 +09:00
下田雅人
4bd009cd13 Merge pull request #365 feature-NEWDWH2021-1497-newbranch into release-8-encise-automation 2024-03-25 12:19:22 +09:00
朝倉 明日香
5167d1e5f8 Merge pull request #358 feature-NEWDWH2021-1473 into develop 2024-03-21 15:11:59 +09:00
朝倉 明日香
abe68940bf Merge pull request #357 feature-NEWDWH2021-1472 into develop 2024-03-21 15:11:43 +09:00
朝倉 明日香
6febed9ccd Merge pull request #356 feature-NEWDWH2021-1471 into develop 2024-03-21 15:11:17 +09:00
朝倉 明日香
b8748a8dfb Merge pull request #366 master into develop 2024-03-21 15:10:59 +09:00
朝倉 明日香
b0affc42d8 Merge pull request #362 feature-NEWDWH2021-1386 into master 2024-03-21 15:09:40 +09:00
nik.n
f97f80f84e Specified ContentType for backups 2024-03-19 17:07:50 +09:00
nik.n
6fd5ada477 Added logic to designate outgoing files as CSV 2024-03-19 11:50:55 +09:00
nik.n
3f506a90f3 Updated the Pipfile for the vulnerability scan 2024-03-18 16:01:14 +09:00
shimoda.m@nds-tyo.co.jp
c1289ff1e5 feat: the 2 fields removed from the User object field definitions are still linked, so register them as trailing fields 2024-03-14 15:43:30 +09:00
shimoda.m@nds-tyo.co.jp
3a18e30a4a feat: the 2 fields removed from the User object field definitions are still linked, so fetch them as trailing fields 2024-03-14 15:29:10 +09:00
朝倉 明日香
968b10047e Merge pull request #359 feature-NEWDWH2021-1487 into develop 2024-03-14 11:20:51 +09:00
shimoda.m@nds-tyo.co.jp
74de4d2251 fix: fixed a defect in how imports were done 2024-03-14 10:02:33 +09:00
朝倉 明日香
1bb28fc083 Merge pull request #361 feature-NEWDWH2021-1496 into release-8-encise-automation 2024-03-12 11:07:15 +09:00
shimoda.m@nds-tyo.co.jp
6dafd8215a feat: made the end-to-end test write processing logs to a file 2024-03-12 10:15:58 +09:00
shimoda.m@nds-tyo.co.jp
c47af35775 feat: excluded .log files from git tracking 2024-03-12 10:13:35 +09:00
shimoda.m@nds-tyo.co.jp
6f814e83a4 fix: S11 #13, #14 fixed defects in the not-received notification year-month check 2024-03-11 14:58:34 +09:00
下田雅人
8cd7e3629a Merge pull request #360 master into develop 2024-03-08 15:52:35 +09:00
下田雅人
408330ba77 Merge pull request #355 release-202403 into master 2024-03-08 15:47:04 +09:00
shimoda.m@nds-tyo.co.jp
731f8d4e7b feat: addressed review feedback. Missed updating the field count 2024-03-08 14:25:44 +09:00
shimoda.m@nds-tyo.co.jp
ce4d44cbb8 feat: addressed review feedback. Missed updating the field count 2024-03-08 14:24:00 +09:00
nik.n
2615538807 Changed the date format 2024-03-08 09:32:03 +09:00
nik.n
7b7ff612f5 Fixed missed date formatting; AS 2024-03-07 17:28:00 +09:00
nik.n
6ecb2b0f0c Changed the extraction conditions 2024-03-07 16:46:42 +09:00
nik.n
fb29466a4a Fixed the Merck facility master creation program 2024-03-07 14:20:57 +09:00
shimoda.m@nds-tyo.co.jp
4cf7d5e66c feat: fixed the individual config mapping lists (7 differential, 3 full) 2024-03-06 17:58:22 +09:00
shimoda.m@nds-tyo.co.jp
b67eb9d543 feat: added extended SQL (3 for history management, 3 for data sync) 2024-03-06 17:54:49 +09:00
shimoda.m@nds-tyo.co.jp
5caf62d636 feat: added 3 full-sync integrations (3 approval processes) 2024-03-06 17:49:47 +09:00
shimoda.m@nds-tyo.co.jp
2d9fa09ce8 feat: added 7 differential objects (approval process, directory) 2024-03-06 17:46:15 +09:00
shimoda.m@nds-tyo.co.jp
0767305ec6 feat: fixed column order 2024-03-06 15:14:18 +09:00
shimoda.m@nds-tyo.co.jp
d8db17873c feat: fixed data registration individual config files: 6 with column order changes 2024-03-06 15:10:13 +09:00
shimoda.m@nds-tyo.co.jp
3844088570 feat: fixed data registration individual config files: 15 with new columns added 2024-03-06 14:54:28 +09:00
shimoda.m@nds-tyo.co.jp
6f10368c2d feat: added last-fetch-datetime files: 7 differential, 3 full 2024-03-06 13:50:58 +09:00
shimoda.m@nds-tyo.co.jp
60703222af feat: fixed full-fetch settings. 3 newly added objects 2024-03-06 11:48:19 +09:00
shimoda.m@nds-tyo.co.jp
4e1ab7808e fix: fixed a value that had accidentally become a list 2024-03-06 10:56:16 +09:00
shimoda.m@nds-tyo.co.jp
3b8abba545 feat: fixed differential-fetch settings. 7 newly added objects 2024-03-06 10:54:43 +09:00
shimoda.m@nds-tyo.co.jp
b42e88b754 feat: fixed differential-fetch settings. Order changes only. 2024-03-06 10:45:53 +09:00
shimoda.m@nds-tyo.co.jp
86f3783d67 feat: fixed fetch settings for the added columns. 2024-03-06 10:37:53 +09:00
shimoda.m@nds-tyo.co.jp
3e6d4e8f0f fix: review fix. Added a comma 2024-03-06 10:33:05 +09:00
shimoda.m@nds-tyo.co.jp
10b57c1d35 style: removed unnecessary line breaks 2024-03-06 10:06:18 +09:00
shimoda.m@nds-tyo.co.jp
421b360a56 feat: added a missed fix 2024-03-06 10:03:10 +09:00
shimoda.m@nds-tyo.co.jp
46140c7818 feature: fixed the procedure for the added columns 2024-03-06 09:55:17 +09:00
下田雅人
f5d5eda1d7 Merge pull request #354 develop into release-8-encise-automation 2024-03-04 15:55:20 +09:00
shimoda.m@nds-tyo.co.jp
5a71bf972f Support for MSJ_Patient__c 2024-02-27 13:41:00 +09:00
shimoda.m@nds-tyo.co.jp
42d3d80199 Merge remote-tracking branch 'origin/develop-fix-webapp-vulnerability' into release-202403 2024-02-27 13:37:31 +09:00
朝倉 明日香
66f8408268 Merge pull request #352 feature-NEWDWH2021-1441 into develop 2024-02-27 11:56:32 +09:00
朝倉 明日香
5ee9d0f82d Merge pull request #349 feature-NEWDWH2021-1441-db-restore into develop 2024-02-27 11:55:54 +09:00
朝倉 明日香
f27ed97aa9 Merge pull request #348 feature-NEWDWH2021-1441-db-export into develop 2024-02-27 11:54:56 +09:00
朝倉 明日香
01356a8735 Merge pull request #346 feature-NEWDWH2021-1439 into develop 2024-02-27 11:15:30 +09:00
朝倉 明日香
c13c5463a5 Merge pull request #353 master into develop 2024-02-27 11:13:16 +09:00
朝倉 明日香
069a8c7450 Merge pull request #351 feature-NEWDWH2021-1385 into master 2024-02-27 11:12:08 +09:00
shimoda.m@nds-tyo.co.jp
0c00074380 fix: connect to the DB up front to suppress errors 2024-02-26 18:57:42 +09:00
shimoda.m@nds-tyo.co.jp
522c48252c fix: fixed an error in the state machine definition 2024-02-26 17:41:00 +09:00
朝倉 明日香
2c63ef7cf3 Merge pull request #350 feature-NEWDWH2021-1460 into develop-fix-webapp-vulnerability 2024-02-26 09:43:28 +09:00
shimoda.m@nds-tyo.co.jp
b93b302423 feat: February 2024 vulnerability scan response 2024-02-22 17:25:39 +09:00
shimoda.m@nds-tyo.co.jp
b8db5c11ab feat: changed the integrity policy: no longer applied to externally loaded resources. 2024-02-22 16:13:20 +09:00
shimoda.m@nds-tyo.co.jp
07ec3d28d5 fix: the integrity for flatpickr-l10n-ja.js was wrong, so it failed to load, which broke the DatePicker; fixed. 2024-02-21 19:32:33 +09:00
nik.n
37b9bc6112 Added prepost_exec pre/post processing 2024-02-21 17:07:24 +09:00
nik.n
b480424354 Removed unneeded verification code 2024-02-21 16:52:19 +09:00
nik.n
9400a7f5ff Review fixes; added prepostexec 2024-02-21 16:48:12 +09:00
nik.n
0459668032 Removed unnecessary modules 2024-02-21 16:19:23 +09:00
nik.n
22d8ee4465 dbrestore-first-commit 2024-02-21 14:26:04 +09:00
nik.n
4b5197bebe dbdump-first-commit 2024-02-20 18:00:49 +09:00
下田雅人
672856df1a Merge pull request #347 feature-NEWDWH2021-1436: infrastructure implementation into develop 2024-02-19 14:30:47 +09:00
nik.n
511feeb15f Fix TaskDefinition 2024-02-16 10:17:59 +09:00
nik.n
4900e1115d Address review comments 2024-02-15 16:45:46 +09:00
nik.n
08df459ed8 Rename task 2024-02-15 16:20:36 +09:00
nik afiq
3dd7b2482d Merge pull request #345 feature-NEWDWH2021-1445 into develop 2024-02-15 14:46:03 +09:00
nik.n
c8ad52db1f Add r-export-dbdump 2024-02-15 13:55:20 +09:00
shimoda.m@nds-tyo.co.jp
b0f39ea79d style: formatting changes 2024-02-13 17:12:31 +09:00
shimoda.m@nds-tyo.co.jp
d5d5b719f7 feat: add commented-out code for local verification 2024-02-13 16:26:20 +09:00
shimoda.m@nds-tyo.co.jp
f1ce484d1d feat: add logic for the year/month targeted by the check 2024-02-13 16:24:16 +09:00
shimoda.m@nds-tyo.co.jp
1153cfa8b5 fix: correct a mistake in the file regex 2024-02-13 16:21:44 +09:00
shimoda.m@nds-tyo.co.jp
2b7031f685 feat: create the notification email template file 2024-02-13 14:48:44 +09:00
shimoda.m@nds-tyo.co.jp
d2485a4afe feat: create the list of files subject to the non-receipt check 2024-02-13 14:45:58 +09:00
shimoda.m@nds-tyo.co.jp
5cadb1a466 feat: add a record-expiry (TTL) attribute to the DynamoDB table 2024-02-13 14:02:08 +09:00
shimoda.m@nds-tyo.co.jp
d3615e38c4 feat: remove the NDS notification SNS, which is no longer needed 2024-02-13 13:46:50 +09:00
shimoda.m@nds-tyo.co.jp
41542bbd9d feat: add exception handling and comments to lambda_handler; rename the non-receipt check function 2024-02-13 13:38:43 +09:00
shimoda.m@nds-tyo.co.jp
8ca83fcc74 style: remove unneeded comments 2024-02-13 13:34:46 +09:00
shimoda.m@nds-tyo.co.jp
58e3a182bc feat: implement error handling for unexpected errors 2024-02-13 13:33:45 +09:00
shimoda.m@nds-tyo.co.jp
29dccb84ff style: remove the TODO comment about implementing error handling 2024-02-13 13:23:03 +09:00
shimoda.m@nds-tyo.co.jp
43d959f669 feat: implement error handling (still in progress) 2024-02-13 13:04:36 +09:00
shimoda.m@nds-tyo.co.jp
add64b3bc8 feat: implement the Encise auto-integration non-receipt check; detailed logic still pending 2024-02-13 12:49:11 +09:00
shimoda.m@nds-tyo.co.jp
ee2557928c style: convert tabs to spaces 2024-02-09 10:42:01 +09:00
shimoda.m@nds-tyo.co.jp
866db319ea feat: implement the Encise data transfer processing 2024-02-09 10:35:52 +09:00
朝倉 明日香
3c23bb94cc Merge pull request #344 fix-NEWDWH2021-1450 into develop 2024-02-07 18:30:44 +09:00
Asuka Asakura
27fd3536d9 Remove erroneous tabs 2024-02-06 09:32:24 +09:00
下田雅人
e362f124bb Merge pull request #343 feature-NEWDWH2021-1450 into develop 2024-02-02 12:50:47 +09:00
Asuka Asakura
930b393196 Add history tracking and fields to MSJ_Patient__c 2024-02-02 12:01:15 +09:00
Asuka Asakura
cd00e24966 Add history tracking and fields to MSJ_Patient__c 2024-02-02 11:57:41 +09:00
nik afiq
595505d129 Merge pull request #342 feature-NEWDWH2021-1435 into develop-fix-webapp-vulnerability 2024-01-30 15:00:38 +09:00
shimoda.m@nds-tyo.co.jp
e217d2ed23 refactor: remove unused CSS libraries 2024-01-30 13:52:02 +09:00
shimoda.m@nds-tyo.co.jp
125b57dd25 feat: standardize on a single CDN 2024-01-30 13:51:36 +09:00
shimoda.m@nds-tyo.co.jp
f93bacd41f fix: the hash for flatpickr/dist/l10n/ja.min.js did not match, causing the error "Failed to find a valid digest in the 'integrity' attribute for resource" 2024-01-30 13:50:23 +09:00
下田雅人
1738b406ba Merge pull request #337 feature-NEWDWH2021-1408 into develop-fix-webapp-vulnerability 2024-01-30 09:37:29 +09:00
nik.n
c5d99acf1b Update folder structure 2024-01-30 09:29:14 +09:00
nik.n
172c6e070b Remove unneeded error handler 2024-01-30 09:21:04 +09:00
shimoda.m@nds-tyo.co.jp
ff6dd0b68a feat: revise the description of SRI hash values 2024-01-29 17:21:31 +09:00
nik.n
3feca4d25c Implement middleware 2024-01-29 16:50:31 +09:00
下田雅人
9f487bce35 Merge pull request #341 master into develop-fix-webapp-vulnerability 2024-01-29 13:46:05 +09:00
nik.n
46fa3844ab Add notes on SRI and how to generate the hashes 2024-01-29 13:46:05 +09:00
nik.n
a435c51bc7 Implement X-Frame-Options header 2024-01-25 16:08:05 +09:00
nik.n
426426b278 Implement Cache-Control, X-Content-Type-Options, and Strict-Transport-Security headers 2024-01-24 15:11:49 +09:00
nik.n
484e77abc7 Add security headers 2024-01-23 09:05:11 +09:00
下田雅人
20aa4e8f24 Merge pull request #336 feature-NEWDWH2021-1406 into develop-fix-webapp-vulnerability 2024-01-18 15:51:54 +09:00
nik.n
88d8985058 Remove unneeded comments 2024-01-18 15:40:48 +09:00
下田雅人
64a25228a1 Merge pull request #333 feature-NEWDWH2021-1405 into develop-fix-webapp-vulnerability 2024-01-18 15:07:23 +09:00
下田雅人
3e0548d0bc Merge pull request #335 feature-NEWDWH2021-1377 into develop-fix-webapp-vulnerability 2024-01-18 14:36:28 +09:00
下田雅人
565f776a5c Merge pull request #332 feature-NEWDWH2021-1404 into develop-fix-webapp-vulnerability 2024-01-18 14:24:39 +09:00
nik.n
fe1e73e52d Hide the API documentation 2024-01-16 16:22:02 +09:00
nik.n
6371924bd9 Address review comments: fix imports and call sites 2024-01-16 14:00:57 +09:00
nik.n
5d4a237904 Remove max_age 2024-01-15 17:32:17 +09:00
Nik Afiq
5c49c92b6b Address review comments 2024-01-15 17:12:39 +09:00
Nik Afiq
ead12039a7 Address review comments 2024-01-15 17:11:46 +09:00
nik.n
d869a91eae Fixes 2024-01-12 16:10:25 +09:00
shimoda.m@nds-tyo.co.jp
b8975ad453 feat: data registration: switch the resource interface to the client interface in the main processing 2023-01-30 11:17:41 +09:00
shimoda.m@nds-tyo.co.jp
7a21b22861 feat: data registration: switch the resource interface to the client interface in the error handling 2023-01-30 10:33:19 +09:00
shimoda.m@nds-tyo.co.jp
7a86e834dc feat: add code so it can be run locally; normally commented out 2023-01-27 17:58:29 +09:00
shimoda.m@nds-tyo.co.jp
419bad2690 feat: data registration: switch the resource interface to the client interface in the end processing 2023-01-27 17:55:09 +09:00
shimoda.m@nds-tyo.co.jp
d5c6485a72 feat: data registration: switch the resource interface to the client interface in the check processing 2023-01-27 16:22:24 +09:00
shimoda.m@nds-tyo.co.jp
1a1499a189 feat: data registration: switch the resource interface to the client interface in the initial processing 2023-01-27 16:10:27 +09:00
286 changed files with 8011 additions and 3136 deletions

3
.gitignore vendored
View File

@@ -15,3 +15,6 @@ stepfunctions/*/build
 # python test
 .coverage
 .report/
+
+# log
+.log

62
ec2/README.md Normal file
View File

@@ -0,0 +1,62 @@
# EC2 instance management materials
## NLB gateway instance startup scripts
### Purpose
An EC2 instance provides gateway functionality so that connections arriving at the NLB, which serves as Merck's DB connection route, can reach the Aurora database through a bastion.
Inside the EC2 instance, a `socat` process must run to forward the designated ports from the NLB to the Aurora database port.
This directory holds the commands that run the `socat` processes and the systemd configuration that starts them when the EC2 instance boots.
### Folder structure
```txt
.
├── README.md -- this file
└── gateway
    ├── staging -- settings for the staging environment
    │   ├── public1-1 -- settings for the ap-northeast-1a instance
    │   │   ├── socat-dbconnection-1a.service -- file that registers the socat process with systemd
    │   │   └── socat-portforwarding-1a.sh -- shell script that port-forwards to the Aurora database via socat
    │   └── public2-1 -- settings for the ap-northeast-1d instance
    │       ├── socat-dbconnection-1d.service
    │       └── socat-portforwarding-1d.sh
    ├── product -- settings for the production environment
    │   ├── public1-1
    │   │   ├── socat-dbconnection-1a.service
    │   │   └── socat-portforwarding-1a.sh
    │   └── public2-1
    │       ├── socat-dbconnection-1d.service
    │       └── socat-portforwarding-1d.sh
```
### File placement procedure (common to both environments)
- Log in to the target gateway EC2 instance.
  - If you logged in via Session Manager, switch to `ec2-user` (`sudo su --login ec2-user`).
- Place the shell script that starts the `socat` process:
  - **For the ap-northeast-1a instance (public1-1)**
    - Run `sudo vi /opt/socat-portforwarding-1a.sh`.
    - Paste in the contents of `ec2/gateway/<environment>/public1-1/socat-portforwarding-1a.sh` and save.
    - Run `sudo chmod 774 /opt/socat-portforwarding-1a.sh`.
  - **For the ap-northeast-1d instance (public2-1)**
    - Run `sudo vi /opt/socat-portforwarding-1d.sh`.
    - Paste in the contents of `ec2/gateway/<environment>/public2-1/socat-portforwarding-1d.sh` and save.
    - Run `sudo chmod 774 /opt/socat-portforwarding-1d.sh`.
- Place the service unit file that keeps the `socat` process resident:
  - **For the ap-northeast-1a instance (public1-1)**
    - Run `sudo vi /etc/systemd/system/socat-dbconnection-1a.service`.
    - Paste in the contents of `ec2/gateway/<environment>/public1-1/socat-dbconnection-1a.service` and save.
    - Run `sudo chmod 774 /etc/systemd/system/socat-dbconnection-1a.service`.
  - **For the ap-northeast-1d instance (public2-1)**
    - Run `sudo vi /etc/systemd/system/socat-dbconnection-1d.service`.
    - Paste in the contents of `ec2/gateway/<environment>/public2-1/socat-dbconnection-1d.service` and save.
    - Run `sudo chmod 774 /etc/systemd/system/socat-dbconnection-1d.service`.
- Register the `socat` startup script with systemd:
  - **For the ap-northeast-1a instance (public1-1)**
    - Run `sudo systemctl enable socat-dbconnection-1a.service` to enable the service.
    - Run `sudo systemctl status socat-dbconnection-1a.service` and check the status; it should show `enabled;`.
  - **For the ap-northeast-1d instance (public2-1)**
    - Run `sudo systemctl enable socat-dbconnection-1d.service` to enable the service.
    - Run `sudo systemctl status socat-dbconnection-1d.service` and check the status; it should show `enabled;`.
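The forwarding scripts below all follow the same `socat tcp4-listen:<port>,reuseaddr,fork TCP:<host>:<port>` shape: each line maps one local listen port to one Aurora endpoint. As a quick illustration (not part of the repository), a small Python sketch that reads the port mapping off such a line; the sample line is copied from the staging script in this change set:

```python
import re

# One forwarding line copied from the staging script in this change set.
SOCAT_LINE = (
    "socat tcp4-listen:40001,reuseaddr,fork "
    "TCP:mbj-newdwh2021-staging-dbcluster-instance-1"
    ".chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &"
)

def parse_socat_forwarding(line: str) -> dict:
    """Extract the listen port and target host:port from a socat
    tcp4-listen forwarding command."""
    m = re.search(r"tcp4-listen:(\d+)\S*\s+TCP:([^\s:]+):(\d+)", line)
    if m is None:
        raise ValueError(f"not a socat forwarding line: {line!r}")
    return {
        "listen_port": int(m.group(1)),  # port the NLB targets on the gateway
        "target_host": m.group(2),       # Aurora endpoint
        "target_port": int(m.group(3)),  # MySQL port
    }

print(parse_socat_forwarding(SOCAT_LINE))
```

Reading the mapping this way makes it easy to double-check, per environment, that each NLB listener port line points at the intended cluster or instance endpoint.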

View File

@@ -0,0 +1,10 @@
[Unit]
Description = socat-dbconnection-1a
[Service]
ExecStart = /opt/socat-portforwarding-1a.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target

View File

@@ -0,0 +1,3 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster-instance-1.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &

View File

@@ -0,0 +1,10 @@
[Unit]
Description = socat-dbconnection-1d
[Service]
ExecStart = /opt/socat-portforwarding-1d.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target

View File

@@ -0,0 +1,3 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster-instance-1-ap-northeast-1d.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-product-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &

View File

@@ -0,0 +1,10 @@
[Unit]
Description = socat-dbconnection-1a
[Service]
ExecStart = /opt/socat-portforwarding-1a.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target

View File

@@ -0,0 +1,3 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster-instance-1.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &

View File

@@ -0,0 +1,10 @@
[Unit]
Description = socat-dbconnection-1d
[Service]
ExecStart = /opt/socat-portforwarding-1d.sh
Type = oneshot
RemainAfterExit = yes
[Install]
WantedBy = default.target

View File

@@ -0,0 +1,3 @@
#!/bin/bash
socat tcp4-listen:40001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster-instance-1-ap-northeast-1d.chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &
socat tcp4-listen:50001,reuseaddr,fork TCP:mbj-newdwh2021-staging-dbcluster.cluster-chs11qsgoyix.ap-northeast-1.rds.amazonaws.com:3306 &

20
ecs/crm-datafetch/.gitignore vendored Normal file
View File

@@ -0,0 +1,20 @@
# Files excluded from management for Lambdas implemented in Node.js
package-lock.json
node_modules/
# Environment-variable file for local verification
.env
# Python virtual-environment files
.venv
# Python cache files
__pycache__/
# Folders produced by converting the Step Functions state definitions
stepfunctions/*/build
**/.vscode/settings.json
# python test
.coverage
.report/
# log
.log

View File

@@ -1,15 +1,15 @@
-FROM python:3.9
+FROM python:3.12-slim-bookworm
 ENV TZ="Asia/Tokyo"
+# Flag to keep Python stdout unbuffered
+ENV PYTHONUNBUFFERED=1
+# Flag to stop Python writing bytecode files
+ENV PYTHONDONTWRITEBYTECODE=1
 WORKDIR /usr/src/app
 COPY Pipfile Pipfile.lock ./
 RUN \
     apt update -y && \
-    # Command that applies only package security updates
-    apt install -y unattended-upgrades && \
-    unattended-upgrades && \
-    pip install --upgrade pip wheel setuptools && \
     pip install pipenv --no-cache-dir && \
     pipenv install --system --deploy && \
     pip uninstall -y pipenv virtualenv-clone virtualenv

View File

@@ -11,7 +11,7 @@ test = "pytest tests/"
 [packages]
 boto3 = "*"
-simple-salesforce = "==1.12.4"
+simple-salesforce = "==1.12.6"
 tenacity = "*"

 [dev-packages]
@@ -23,4 +23,4 @@ pytest-html = "*"
 moto = "*"

 [requires]
-python_version = "3.9"
+python_version = "3.12"

File diff suppressed because it is too large

View File

@@ -4,7 +4,7 @@
 ### Tool versions
-- Python 3.9.x
+- Python 3.12.x
 - PipEnv (Python dependency-management module)

 ### Development environment

View File

@@ -12,87 +12,87 @@ from src.system_var.environments import (CRM_BACKUP_BUCKET, CRM_CONFIG_BUCKET,
                                          RESPONSE_JSON_BACKUP_FOLDER)


-class S3Resource:
+class S3Client:

     def __init__(self, bucket_name: str) -> None:
-        self.__s3_resource = boto3.resource(AWS_RESOURCE_S3)
-        self.__s3_bucket = self.__s3_resource.Bucket(bucket_name)
+        self.__s3_client = boto3.client(AWS_RESOURCE_S3)
+        self.__s3_bucket = bucket_name

     def get_object(self, object_key: str) -> str:
-        response = self.__s3_bucket.Object(object_key).get()
+        response = self.__s3_client.get_object(Bucket=self.__s3_bucket, Key=object_key)
         body = response[S3_RESPONSE_BODY].read()
         return body.decode(S3_CHAR_CODE)

     def put_object(self, object_key: str, local_file_path: str) -> None:
-        self.__s3_bucket.upload_file(Key=object_key, Filename=local_file_path)
+        self.__s3_client.upload_file(Filename=local_file_path, Bucket=self.__s3_bucket, Key=object_key)
         return

     def copy(self, src_bucket: str, src_key: str, dest_bucket: str, dest_key: str) -> None:
         copy_source = {'Bucket': src_bucket, 'Key': src_key}
-        self.__s3_resource.meta.client.copy(copy_source, dest_bucket, dest_key)
+        self.__s3_client.copy_object(CopySource=copy_source, Bucket=dest_bucket, Key=dest_key)
         return


 class ConfigBucket:
-    __s3_resource: S3Resource = None
+    __s3_client: S3Client = None

     def __init__(self) -> None:
-        self.__s3_resource = S3Resource(CRM_CONFIG_BUCKET)
+        self.__s3_client = S3Client(CRM_CONFIG_BUCKET)

     def __str__(self) -> str:
         return CRM_CONFIG_BUCKET

     def get_object_info_file(self) -> str:
-        return self.__s3_resource.get_object(f'{OBJECT_INFO_FOLDER}/{OBJECT_INFO_FILENAME}')
+        return self.__s3_client.get_object(f'{OBJECT_INFO_FOLDER}/{OBJECT_INFO_FILENAME}')

     def get_last_fetch_datetime_file(self, file_key: str) -> str:
-        return self.__s3_resource.get_object(f'{LAST_FETCH_DATE_FOLDER}/{file_key}')
+        return self.__s3_client.get_object(f'{LAST_FETCH_DATE_FOLDER}/{file_key}')

     def put_last_fetch_datetime_file(self, file_key: str, local_file_path: str) -> None:
-        self.__s3_resource.put_object(
+        self.__s3_client.put_object(
             f'{LAST_FETCH_DATE_FOLDER}/{file_key}', local_file_path)
         return


 class DataBucket:
-    __s3_resource: S3Resource = None
+    __s3_client: S3Client = None

     def __init__(self) -> None:
-        self.__s3_resource = S3Resource(IMPORT_DATA_BUCKET)
+        self.__s3_client = S3Client(IMPORT_DATA_BUCKET)

     def __str__(self) -> str:
         return IMPORT_DATA_BUCKET

     def put_csv(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{CRM_IMPORT_DATA_FOLDER}/{file_key}'
-        self.__s3_resource.put_object(object_key, local_file_path)
+        self.__s3_client.put_object(object_key, local_file_path)
         return

     def put_csv_from(self, src_bucket: str, src_key: str):
         dest_filename = src_key.split('/')[-1]
-        self.__s3_resource.copy(src_bucket, src_key, str(self), f'{CRM_IMPORT_DATA_FOLDER}/{dest_filename}')
+        self.__s3_client.copy(src_bucket, src_key, str(self), f'{CRM_IMPORT_DATA_FOLDER}/{dest_filename}')
         return


 class BackupBucket:
-    __s3_resource: S3Resource = None
+    __s3_client: S3Client = None

     def __init__(self) -> None:
-        self.__s3_resource = S3Resource(CRM_BACKUP_BUCKET)
+        self.__s3_client = S3Client(CRM_BACKUP_BUCKET)

     def __str__(self) -> str:
         return CRM_BACKUP_BUCKET

     def put_response_json(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{RESPONSE_JSON_BACKUP_FOLDER}/{file_key}'
-        self.__s3_resource.put_object(object_key, local_file_path)
+        self.__s3_client.put_object(object_key, local_file_path)
         return

     def put_csv(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{CRM_IMPORT_DATA_BACKUP_FOLDER}/{file_key}'
-        self.__s3_resource.put_object(object_key, local_file_path)
+        self.__s3_client.put_object(object_key, local_file_path)
         return

     def put_result_json(self, file_key: str, local_file_path: str) -> None:
         object_key = f'{PROCESS_RESULT_FOLDER}/{file_key}'
-        self.__s3_resource.put_object(object_key, local_file_path)
+        self.__s3_client.put_object(object_key, local_file_path)
         return
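This file swaps boto3's resource interface for the client interface while keeping the wrapper's public methods unchanged. A minimal runnable sketch of that pattern, with a hypothetical in-memory `FakeS3` standing in for `boto3.client('s3')` so it runs without AWS credentials (method names and keyword arguments mirror the real client calls used here):

```python
import io

class FakeS3:
    """Hypothetical in-memory stand-in for boto3.client("s3")."""
    def __init__(self):
        self._objects = {}
    def get_object(self, Bucket, Key):
        return {"Body": io.BytesIO(self._objects[(Bucket, Key)])}
    def put_object(self, Bucket, Key, Body):
        self._objects[(Bucket, Key)] = Body
    def copy_object(self, CopySource, Bucket, Key):
        src = (CopySource["Bucket"], CopySource["Key"])
        self._objects[(Bucket, Key)] = self._objects[src]

class S3Client:
    """Client-interface wrapper in the shape of the refactored class:
    it stores only the bucket name and passes Bucket=... on every call,
    instead of holding a boto3 resource Bucket object."""
    def __init__(self, bucket_name, client=None):
        # In the real code this is boto3.client("s3"); a fake can be
        # injected so the sketch runs without AWS access.
        self._client = client if client is not None else FakeS3()
        self._bucket = bucket_name
    def get_object(self, object_key):
        resp = self._client.get_object(Bucket=self._bucket, Key=object_key)
        return resp["Body"].read().decode("utf-8")
    def copy(self, src_bucket, src_key, dest_bucket, dest_key):
        self._client.copy_object(
            CopySource={"Bucket": src_bucket, "Key": src_key},
            Bucket=dest_bucket, Key=dest_key)
```

Keeping the wrapper's signature identical is what lets ConfigBucket, DataBucket, and BackupBucket stay unchanged apart from the renamed attribute.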

View File

@@ -1,7 +1,8 @@
 import os

 import pytest
-from src.aws.s3 import BackupBucket, ConfigBucket, DataBucket, S3Resource
+
+from src.aws.s3 import BackupBucket, ConfigBucket, DataBucket, S3Client

 @pytest.fixture
@@ -15,7 +16,7 @@ def s3_test(s3_client, bucket_name):
     yield

-class TestS3Resource:
+class TestS3Client:

     def test_get_object(self, s3_test, s3_client, bucket_name):
         """
@@ -31,7 +32,7 @@ class TestS3Resource:
         s3_client.put_object(Bucket=bucket_name, Key='hogehoge/test.txt', Body=b'aaaaaaaaaaaaaaa')

         # Act
-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         actual = sut.get_object('hogehoge/test.txt')

         # Assert
@@ -48,7 +49,7 @@ class TestS3Resource:
         """
         # Arrange
         # Act
-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.get_object('hogehoge/test.txt')
@@ -68,7 +69,7 @@ class TestS3Resource:
         with open(file_path, mode='w') as f:
             f.write('aaaaaaaaaaaaaaa')

-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         sut.put_object('hogehoge/test.txt', file_path)

         actual = s3_client.get_object(Bucket=bucket_name, Key='hogehoge/test.txt')
@@ -87,7 +88,7 @@ class TestS3Resource:
         """
         # Arrange
         # Act
-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.put_object('hogehoge/test.txt', 'aaaaaaaaaaaaaaa')
@@ -108,7 +109,7 @@ class TestS3Resource:
         s3_client.create_bucket(Bucket=for_copy_bucket)
         s3_client.put_object(Bucket=bucket_name, Key='hogehoge/test.txt', Body=b'aaaaaaaaaaaaaaa')

-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         sut.copy(bucket_name, 'hogehoge/test.txt', for_copy_bucket, 'test.txt')

         actual = s3_client.get_object(Bucket=for_copy_bucket, Key='test.txt')
@@ -125,7 +126,7 @@ class TestS3Resource:
         """
         # Arrange
         # Act
-        sut = S3Resource(bucket_name)
+        sut = S3Client(bucket_name)
         with pytest.raises(Exception):
             # Assert
             sut.copy(bucket_name, 'hogehoge/test.txt', 'for_copy_bucket', 'test.txt')
@@ -140,8 +141,8 @@ class TestS3Resource:
         - An exception is raised when the instance is created
         """
         with pytest.raises(Exception) as e:
-            S3Resource()
-        assert e.value.args[0] == "__init__() missing 1 required positional argument: 'bucket_name'"
+            S3Client()
+        assert e.value.args[0] == "S3Client.__init__() missing 1 required positional argument: 'bucket_name'"

 class TestConfigBucket:
class TestConfigBucket: class TestConfigBucket:

View File

@@ -4,8 +4,7 @@ from datetime import datetime

 import boto3
 import pytest
-from moto import mock_s3
-from py.xml import html  # type: ignore
+from moto import mock_aws

 from . import docstring_parser
@@ -21,7 +20,7 @@ def aws_credentials():

 @pytest.fixture
 def s3_client(aws_credentials):
-    with mock_s3():
+    with mock_aws():
         conn = boto3.client("s3", region_name="us-east-1")
         yield conn
@@ -35,18 +34,18 @@ def pytest_html_report_title(report):

 def pytest_html_results_table_header(cells):
     del cells[2:]
-    cells.insert(3, html.th("Cases"))
-    cells.insert(4, html.th("Arranges"))
-    cells.insert(5, html.th("Expects"))
-    cells.append(html.th("Time", class_="sortable time", col="time"))
+    cells.insert(3, '<th>Cases</th>')
+    cells.insert(4, '<th>Arranges</th>')
+    cells.insert(5, '<th>Expects</th>')
+    cells.append('<th class="sortable time" col="time">Time</th>')

 def pytest_html_results_table_row(report, cells):
     del cells[2:]
-    cells.insert(3, html.td(html.pre(report.cases)))  # Output the "test details" to the report
-    cells.insert(4, html.td(html.pre(report.arranges)))  # Output the "expected results" to the report
-    cells.insert(5, html.td(html.pre(report.expects)))  # Output the "expected results" to the report
-    cells.append(html.td(datetime.now(), class_="col-time"))  # Also output the "time" to the report
+    cells.insert(3, f'<td><pre>{report.cases}</pre></td>')  # Output the "test details" to the report
+    cells.insert(4, f'<td><pre>{report.arranges}</pre></td>')  # Output the "expected results" to the report
+    cells.insert(5, f'<td><pre>{report.expects}</pre></td>')  # Output the "expected results" to the report
+    cells.append(f'<td class="col-time">{datetime.now()}</td>')  # Also output the "time" to the report

 @pytest.hookimpl(hookwrapper=True)
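The row hooks in this conftest now emit raw HTML strings instead of `py.xml` builder objects (newer pytest-html accepts plain strings for cells, and the `py.xml` builders are no longer available). A small sketch of the cell-building logic, using a hypothetical helper name that is not part of the repository:

```python
from datetime import datetime

def build_result_cells(cases: str, arranges: str, expects: str) -> list[str]:
    """Build the extra report cells as plain HTML strings, mirroring the
    shape of the updated pytest_html_results_table_row hook.
    (Hypothetical helper for illustration only.)"""
    return [
        f'<td><pre>{cases}</pre></td>',
        f'<td><pre>{arranges}</pre></td>',
        f'<td><pre>{expects}</pre></td>',
        f'<td class="col-time">{datetime.now()}</td>',
    ]

print(build_result_cells('case', 'arrange', 'expect')[0])  # prints <td><pre>case</pre></td>
```

Because the cells are plain strings, any report text containing `<` or `&` would need HTML-escaping before interpolation; the hooks here interpolate the parsed docstring fields directly.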

View File

@@ -652,7 +652,8 @@ class TestSalesforceApiClient:
         actual = sut.fetch_sf_data(soql)

         assert len(actual) > 0
-        assert dict(actual[0])["RelationshipTest__r"]["RecordType"]["DeveloperName"] == "RecordTypeSpecial"
+        print(dict(actual[0]))
+        assert dict(actual[0])["RelationshipTest__r"]["RecordType"]["DeveloperName"] == "RecordTypeNormal"

     def test_raise_create_instance_cause_auth_failed(self, monkeypatch):
         """

View File

@@ -6,6 +6,7 @@ from datetime import datetime, timezone

 import boto3
 import pytest
+from src.controller import controller
 from src.parser.json_parser import JsonParser
 from src.system_var.constants import YYYYMMDDTHHMMSSTZ
@@ -114,6 +115,10 @@ def test_walk_through(s3_test, s3_client, monkeypatch, caplog):
     logger.info(f'##########################')
     # Assertion
     log_messages = caplog.messages
+    # Write the logs to a local file to make visual inspection easier.
+    with open('crm_datafetch_test_walk_through_diff.log', 'w', encoding='utf8') as f:
+        f.write('\n'.join(log_messages))
+
     # Check the logs from before the loop
     assert 'I-CTRL-01 CRMデータ取得処理を開始します' in log_messages
     assert 'I-CTRL-02 データ取得準備処理呼び出し' in log_messages
@@ -170,6 +175,10 @@ def test_walk_through(s3_test, s3_client, monkeypatch, caplog):
     logger.info(f'##########################')
     # Fetch the logs again
     log_messages_all = caplog.messages
+    # Write the logs to a local file to make visual inspection easier.
+    with open('crm_datafetch_test_walk_through_all.log', 'w', encoding='utf8') as f:
+        f.write('\n'.join(log_messages_all))
+
     object_info_list_all = object_info_files[1]
     # The start-up logs and similar are already tested, so skip checking them
     for object_info in object_info_list_all['objects']:

View File

@@ -99,8 +99,9 @@ class TestCounterObject:
             sut = CounterObject()
             sut.describe(1)

+        print(str(e.value))
         # Expects
-        assert str(e.value) == 'describe() takes 1 positional argument but 2 were given'
+        assert str(e.value) == 'CounterObject.describe() takes 1 positional argument but 2 were given'

     def test_increment(self) -> int:
         """

View File

@@ -1,16 +1,19 @@
-FROM python:3.9
+FROM python:3.12-slim-bookworm
 ENV TZ="Asia/Tokyo"
+# Flag to keep Python stdout unbuffered
+ENV PYTHONUNBUFFERED=1
+# Flag to stop Python writing bytecode files
+ENV PYTHONDONTWRITEBYTECODE=1
 WORKDIR /usr/src/app
-COPY requirements.txt ./
+COPY Pipfile Pipfile.lock ./
 RUN \
     apt update -y && \
-    # Command that applies only package security updates
-    apt install -y unattended-upgrades && \
-    unattended-upgrades && \
-    pip install --upgrade pip wheel setuptools && \
-    pip install --no-cache-dir -r requirements.txt
+    pip install pipenv --no-cache-dir && \
+    pipenv install --system --deploy && \
+    pip uninstall -y pipenv virtualenv-clone virtualenv
 COPY dataimport ./
 CMD [ "python", "./controller.py" ]

13
ecs/dataimport/Pipfile Normal file
View File

@@ -0,0 +1,13 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
boto3 = "*"
pymysql = "*"

[dev-packages]

[requires]
python_version = "3.12"

87
ecs/dataimport/Pipfile.lock generated Normal file
View File

@@ -0,0 +1,87 @@
{
"_meta": {
"hash": {
"sha256": "1738beec0de1a16f127d9bbeef1c9cb1ffb5b2377aa1aedbce9bfacae0fa1c67"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.12"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"boto3": {
"hashes": [
"sha256:3faa2c328a61745f3215a63039606a6fcf55d9afe1cc76e3a5e27b9db58cdbf6",
"sha256:b998edac72f6740bd5d9d585cf3880f2dfeb4842e626b34430fd0e9623378011"
],
"index": "pypi",
"markers": "python_version >= '3.9'",
"version": "==1.38.32"
},
"botocore": {
"hashes": [
"sha256:0899a090e352cb5eeaae2c7bb52a987b469d23912c7ece86664dfb5c2e074978",
"sha256:64ab919a5d8b74dd73eaac1f978d0e674d11ff3bbe8815c3d2982477be9a082c"
],
"markers": "python_version >= '3.9'",
"version": "==1.38.32"
},
"jmespath": {
"hashes": [
"sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980",
"sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe"
],
"markers": "python_version >= '3.7'",
"version": "==1.0.1"
},
"pymysql": {
"hashes": [
"sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c",
"sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0"
],
"index": "pypi",
"markers": "python_version >= '3.7'",
"version": "==1.1.1"
},
"python-dateutil": {
"hashes": [
"sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
"sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==2.9.0.post0"
},
"s3transfer": {
"hashes": [
"sha256:0148ef34d6dd964d0d8cf4311b2b21c474693e57c2e069ec708ce043d2b527be",
"sha256:f5e6db74eb7776a37208001113ea7aa97695368242b364d73e91c981ac522177"
],
"markers": "python_version >= '3.9'",
"version": "==0.13.0"
},
"six": {
"hashes": [
"sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274",
"sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==1.17.0"
},
"urllib3": {
"hashes": [
"sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466",
"sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813"
],
"markers": "python_version >= '3.9'",
"version": "==2.4.0"
}
},
"develop": {}
}

View File

@@ -4,7 +4,6 @@ import sys
 from datetime import datetime

 import boto3
-
 from common import convert_quotechar, debug_log
 from end import end
 from error import error
@@ -41,7 +40,7 @@ LINE_FEED_CODE = {
 }

 # Class variables
-s3_resource = boto3.resource('s3')
+s3_client = boto3.client('s3')

 # Check exception class
@@ -74,16 +73,14 @@ def check(bucket_name, target_data_source, target_file_name, settings_key, log_i
     print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-CHK-01 - チェック処理を開始します')

     # Load the data
-    settings_obj = s3_resource.Object(bucket_name, settings_key)
-    settings_response = settings_obj.get()
+    settings_obj_response = s3_client.get_object(Bucket=bucket_name, Key=settings_key)
     settings_list = []
-    for line in io.TextIOWrapper(io.BytesIO(settings_response["Body"].read()), encoding='utf-8'):
+    for line in io.TextIOWrapper(io.BytesIO(settings_obj_response["Body"].read()), encoding='utf-8'):
         settings_list.append(line.rstrip('\n'))

     work_key = target_data_source + DIRECTORY_WORK + target_file_name
-    work_obj = s3_resource.Object(bucket_name, work_key)
-    work_response = work_obj.get()
-    work_data = io.TextIOWrapper(io.BytesIO(work_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
+    work_obj_response = s3_client.get_object(Bucket=bucket_name, Key=work_key)
+    work_data = io.TextIOWrapper(io.BytesIO(work_obj_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
     work_csv_row = []
     for i, line in enumerate(csv.reader(work_data, quotechar=convert_quotechar(settings_list[SETTINGS_ITEM["quotechar"]]), delimiter=settings_list[SETTINGS_ITEM["delimiter"]])):
         # If the file has a header and this is the first row
@@ -148,3 +145,16 @@ def is_empty_file(work_csv_row: list, settings_list: list):
         return len(work_csv_row[1:]) == 0
     return len(work_csv_row) == 0
+
+
+# Code for running locally
+# Adjust the values as needed
+# if __name__ == '__main__':
+#     check(
+#         bucket_name='bucket name',
+#         target_data_source='data source name',
+#         target_file_name='file name in the target folder',
+#         settings_key='per-source settings file name',
+#         log_info='Info',
+#         mode='i'
+#     )

View File

@@ -1,7 +1,8 @@
 from datetime import datetime
 import boto3
-from error import error
 from common import debug_log
+from error import error
 # 定数
 LOG_LEVEL = {'i': 'Info', 'e': 'Error'}
@@ -12,7 +13,6 @@ DIRECTORY_WARNING = '/warning/'
 # クラス変数
 s3_client = boto3.client('s3')
-s3_resource = boto3.resource('s3')
 def end(bucket_name, target_data_source, target_file_name, warning_info, log_info, mode):
@@ -45,8 +45,7 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
 }
 done_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
 done_key = target_data_source + DIRECTORY_DONE + done_file_name
-done_obj = s3_resource.Object(bucket_name, done_key)
-done_obj.copy(copy_source)
+s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=done_key)
 s3_client.delete_object(Bucket=bucket_name, Key=work_key)
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-02 - workディレクトリの {target_file_name} をdoneディレクトリに移動しました 移動後ファイル名:{done_file_name}')
@@ -64,23 +63,20 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
 # warningファイルの作成
 warning_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}_war.log'
 warning_key = target_data_source + DIRECTORY_WARNING + warning_file_name
-warning_obj = s3_resource.Object(bucket_name, warning_key)
-warning_obj.put(Body=warning_info)
+s3_client.put_object(Bucket=bucket_name, Key=warning_key, Body=bytes(warning_info, 'utf-8'))
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-06 - warningディレクトリに {warning_file_name} を作成しました')
 # warning処理結果ファイルの作成
 result_warning_file_name = target_file_name + '.warning'
 result_warning_key = target_data_source + DIRECTORY_TARGET + result_warning_file_name
-result_warning_obj = s3_resource.Object(bucket_name, result_warning_key)
-result_warning_obj.put(Body='')
+s3_client.put_object(Bucket=bucket_name, Key=result_warning_key, Body=b'')
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-07 - targetディレクトリに {result_warning_file_name} を作成しました')
 else:
 # done処理結果ファイルの作成
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-08 - Warning情報は存在しませんでした')
 result_done_file_name = target_file_name + '.done'
 result_done_key = target_data_source + DIRECTORY_TARGET + result_done_file_name
-result_done_obj = s3_resource.Object(bucket_name, result_done_key)
-result_done_obj.put(Body='')
+s3_client.put_object(Bucket=bucket_name, Key=result_done_key, Body=b'')
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-END-09 - targetディレクトリに {result_done_file_name} を作成しました')
 # ⑤ 終了処理終了ログを出力する
@@ -88,3 +84,17 @@ def end(bucket_name, target_data_source, target_file_name, warning_info, log_inf
 except Exception as e:
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-END-99 - エラー内容:{e}')
 error(bucket_name, target_data_source, target_file_name, log_info)
+# ローカル実行用コード
+# 値はよしなに変えてください
+# if __name__ == '__main__':
+# end(
+# bucket_name='バケット名',
+# target_data_source='データソース名',
+# target_file_name='targetフォルダ内のファイル',
+# # warning_info='ワーニング内容', # ワーニングがある場合のテストはこちらを生かす
+# warning_info='',
+# log_info='Info',
+# mode='i'
+# )
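この差分では、workディレクトリからdoneディレクトリへの「移動」を `s3_client.copy` と `s3_client.delete_object` の組み合わせで行うようになった。S3 にはリネーム操作がないため、コピー後に元キーを削除するのが定石である。以下は呼び出し順だけを確認する説明用スケッチで、`move_object` と `FakeS3` は本文には登場しない仮の名前(実運用では `boto3.client('s3')` を渡す想定)。

```python
def move_object(s3, bucket_name, src_key, dst_key):
    """end()と同じ手順の「移動」: copy してから delete_object する(説明用の仮関数)"""
    copy_source = {'Bucket': bucket_name, 'Key': src_key}
    s3.copy(CopySource=copy_source, Bucket=bucket_name, Key=dst_key)
    s3.delete_object(Bucket=bucket_name, Key=src_key)


class FakeS3:
    """boto3を使わずに呼び出し順だけ記録するフェイククライアント(説明用)"""
    def __init__(self):
        self.calls = []

    def copy(self, CopySource, Bucket, Key):
        self.calls.append(('copy', CopySource['Key'], Key))

    def delete_object(self, Bucket, Key):
        self.calls.append(('delete', Key))


fake = FakeS3()
move_object(fake, 'my-bucket', 'src01/work/data.csv', 'src01/done/20250101000000_data.csv')
print(fake.calls)
```

copy が失敗すれば例外で delete まで到達しないため、「削除だけ先に走って元ファイルを失う」事故が起きない点がこの順序の利点である。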

View File

@@ -1,6 +1,7 @@
-from datetime import datetime
-import boto3
 import sys
+from datetime import datetime
+import boto3
 # 定数
 LOG_LEVEL = {'i': 'Info', 'e': 'Error'}
@@ -10,7 +11,6 @@ DIRECTORY_ERROR = '/error/'
 # クラス変数
 s3_client = boto3.client('s3')
-s3_resource = boto3.resource('s3')
 def error(bucket_name, target_data_source, target_file_name, log_info):
@@ -34,8 +34,7 @@ def error(bucket_name, target_data_source, target_file_name, log_info):
 }
 error_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
 error_key = target_data_source + DIRECTORY_ERROR + error_file_name
-error_obj = s3_resource.Object(bucket_name, error_key)
-error_obj.copy(copy_source)
+s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=error_key)
 s3_client.delete_object(Bucket=bucket_name, Key=work_key)
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-02 - workディレクトリの {target_file_name} をerrorディレクトリに移動しました 移動後ファイル名:{error_file_name}')
@@ -48,8 +47,7 @@ def error(bucket_name, target_data_source, target_file_name, log_info):
 # ④ S3バケット内のtargetディレクトリに、「投入データファイル名.error」ファイルを作成する
 result_error_file_name = target_file_name + '.error'
 result_error_key = target_data_source + DIRECTORY_TARGET + result_error_file_name
-result_error_obj = s3_resource.Object(bucket_name, result_error_key)
-result_error_obj.put(Body='')
+s3_client.put_object(Bucket=bucket_name, Key=result_error_key, Body=b'')
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-04 - targetディレクトリに {result_error_file_name} を作成しました')
 except Exception as e:
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-ERR-99 - エラー内容:{e}')
@@ -81,16 +79,14 @@ def error_doing_file_exists(bucket_name, target_key, target_data_source, target_
 }
 error_file_name = f'{datetime.now():%Y%m%d%H%M%S}_{target_file_name}'
 error_key = target_data_source + DIRECTORY_ERROR + error_file_name
-error_obj = s3_resource.Object(bucket_name, error_key)
-error_obj.copy(copy_source)
+s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=error_key)
 s3_client.delete_object(Bucket=bucket_name, Key=target_key)
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-07 - targetディレクトリの {target_file_name} をerrorディレクトリに移動しました 移動後ファイル名:{error_file_name}')
 # ③ S3バケット内のtargetディレクトリに、「投入データファイル名.exclusive_error」ファイルを作成する
 result_error_file_name = target_file_name + '.exclusive_error'
 result_error_key = target_data_source + DIRECTORY_TARGET + result_error_file_name
-result_error_obj = s3_resource.Object(bucket_name, result_error_key)
-result_error_obj.put(Body='')
+s3_client.put_object(Bucket=bucket_name, Key=result_error_key, Body=b'')
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-ERR-08 - targetディレクトリに {result_error_file_name} を作成しました')
 except Exception as e:
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-ERR-99 - エラー内容:{e}')
@@ -100,3 +96,24 @@ def error_doing_file_exists(bucket_name, target_key, target_data_source, target_
 # ⑤ 処理を終了する
 sys.exit()
+# ローカル実行用コード
+# 値はよしなに変えてください
+# if __name__ == '__main__':
+# エラー処理
+# error(
+# bucket_name='バケット名',
+# target_data_source='データソース名',
+# target_file_name='データソース名/target/ファイル名',
+# log_info='Info'
+# )
+# doingファイルの有無チェック関数
+# error_doing_file_exists(
+# bucket_name='バケット名',
+# target_key='投入データのフルパス',
+# target_data_source='投入データのディレクトリ名よりデータソースに該当する部分',
+# target_file_name='投入データのファイル名',
+# log_info='Info'
+# )

View File

@@ -1,12 +1,12 @@
-from datetime import datetime
-import boto3
-import io
 import csv
+import io
 import re
 import sys
-from error import error
-from error import error_doing_file_exists
+from datetime import datetime
+import boto3
 from common import debug_log
+from error import error, error_doing_file_exists
 # 定数
 LOG_LEVEL = {"i": 'Info', "e": 'Error'}
@@ -17,7 +17,6 @@ DIRECTORY_SETTINGS = '/settings/'
 # クラス変数
 s3_client = boto3.client('s3')
-s3_resource = boto3.resource('s3')
 def init(bucket_name, target_key, target_data_source, target_file_name, log_info, mode):
@@ -60,8 +59,7 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
 try:
 # ③ S3バケット内のtargetディレクトリに、「投入データファイル名.doing」ファイルを作成する
-doing_obj = s3_resource.Object(bucket_name, doing_key)
-doing_obj.put(Body='')
+s3_client.put_object(Bucket=bucket_name, Key=doing_key, Body=b'')
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-04 - targetディレクトリに {doing_file_name} を作成しました')
 # ④ 投入データファイルをS3バケット内のtargetディレクトリから、workディレクトリに移動(コピー削除)する
@@ -70,8 +68,7 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
 'Key': target_key
 }
 work_key = target_data_source + DIRECTORY_WORK + target_file_name
-work_obj = s3_resource.Object(bucket_name, work_key)
-work_obj.copy(copy_source)
+s3_client.copy(CopySource=copy_source, Bucket=bucket_name, Key=work_key)
 s3_client.delete_object(Bucket=bucket_name, Key=target_key)
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-05 - 投入データ {target_file_name} をworkディレクトリに移動しました')
 except Exception as e:
@@ -122,9 +119,8 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
 try:
 # ⑦ 個別設定ファイルを特定する
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-INI-17 - 個別設定ファイルを検索します')
-mapping_obj = s3_resource.Object(bucket_name, mapping_key)
-mapping_response = mapping_obj.get()
-mapping_body = io.TextIOWrapper(io.BytesIO(mapping_response["Body"].read()), encoding='utf-8')
+mapping_obj_response = s3_client.get_object(Bucket=bucket_name, Key=mapping_key)
+mapping_body = io.TextIOWrapper(io.BytesIO(mapping_obj_response["Body"].read()), encoding='utf-8')
 settings_file_name = ''
 for row in csv.reader(mapping_body, delimiter='\t'):
 if row:
@@ -159,3 +155,15 @@ def init(bucket_name, target_key, target_data_source, target_file_name, log_info
 except Exception as e:
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["e"]} E-INI-99 - エラー内容:{e}')
 error(bucket_name, target_data_source, target_file_name, log_info)
+# ローカル実行用コード
+# 値はよしなに変えてください
+# if __name__ == '__main__':
+# init(
+# bucket_name='バケット名',
+# target_key='データソース名/target/ファイル名',
+# target_data_source='データソース名',
+# target_file_name='ファイル名',
+# log_info='Info',
+# mode='i'
+# )
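init() はマッピングファイルを TSV として読み込み、投入データに対応する個別設定ファイル名を検索している。行の列構成はこのハンクからは読み取れないため、以下は「ファイル名パターン \t 個別設定ファイル名」の2列と仮定した説明用スケッチである(`raw_body` の内容と `re.fullmatch` による照合方法も仮定)。

```python
import csv
import io
import re

# get_object のレスポンス Body 相当の仮データ(説明用)。
# 実際の mapping ファイルの列構成は差分には現れないため2列と仮定する
raw_body = b'sales_\\d{8}\\.csv\tsales_settings.tsv\nstock_\\d{8}\\.csv\tstock_settings.tsv\n'
mapping_body = io.TextIOWrapper(io.BytesIO(raw_body), encoding='utf-8')

target_file_name = 'sales_20250101.csv'
settings_file_name = ''
for row in csv.reader(mapping_body, delimiter='\t'):
    # 空行を読み飛ばしつつ、パターンに合致した行の設定ファイル名を採用する
    if row and re.fullmatch(row[0], target_file_name):
        settings_file_name = row[1]
        break
print(settings_file_name)  # sales_settings.tsv
```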

View File

@@ -5,10 +5,9 @@ from datetime import datetime
 import boto3
 import pymysql
+from pymysql.constants import CLIENT
 from common import convert_quotechar, debug_log
 from error import error
-from pymysql.constants import CLIENT
 # 定数
 DIRECTORY_WORK = '/work/'
@@ -47,7 +46,6 @@ INVALID_CONFIG_EXCEPTION_MESSAGE = f'個別設定ファイルのインポート
 # クラス変数
 s3_client = boto3.client('s3')
-s3_resource = boto3.resource('s3')
 def main(bucket_name, target_data_source, target_file_name, settings_key, db_info, log_info, mode):
@@ -91,8 +89,7 @@ def main(bucket_name, target_data_source, target_file_name, settings_key, db_inf
 # ④ 個別設定ファイルのロードスキーマのテーブル名に記載されているテーブルをTRUNCATEする
 # 個別設定ファイルの読み込み
-settings_obj = s3_resource.Object(bucket_name, settings_key)
-settings_response = settings_obj.get()
+settings_response = s3_client.get_object(Bucket=bucket_name, Key=settings_key)
 settings_list = []
 for line in io.TextIOWrapper(io.BytesIO(settings_response["Body"].read()), encoding='utf-8'):
 settings_list.append(line.rstrip('\n'))
@@ -110,8 +107,7 @@ def main(bucket_name, target_data_source, target_file_name, settings_key, db_inf
 # ⑤ 投入データファイルを1行ごとにループする
 print(f'{datetime.now():%Y-%m-%d %H:%M:%S} {log_info} {LOG_LEVEL["i"]} I-MAIN-05 - 投入データ {target_file_name} の読み込みを開始します')
 work_key = target_data_source + DIRECTORY_WORK + target_file_name
-work_obj = s3_resource.Object(bucket_name, work_key)
-work_response = work_obj.get()
+work_response = s3_client.get_object(Bucket=bucket_name, Key=work_key)
 work_data = io.TextIOWrapper(io.BytesIO(work_response["Body"].read()), encoding=settings_list[SETTINGS_ITEM["charCode"]], newline=LINE_FEED_CODE[settings_list[SETTINGS_ITEM["lineFeedCode"]]])
 process_count = 0 # 処理件数カウンタ
@@ -261,10 +257,9 @@
 try:
 if ex_sql_file_exists:
 # 拡張SQLファイルからSQL文生成
-ex_sqls_obj = s3_resource.Object(bucket_name, ex_sql_key)
-ex_sql_response = ex_sqls_obj.get()
+ex_sql_obj_response = s3_client.get_object(Bucket=bucket_name, Key=ex_sql_key)
 ex_sql = ''
-for line in io.TextIOWrapper(io.BytesIO(ex_sql_response["Body"].read()), encoding='utf-8'):
+for line in io.TextIOWrapper(io.BytesIO(ex_sql_obj_response["Body"].read()), encoding='utf-8'):
 ex_sql = f'{ex_sql} {line.rstrip()}'
 # トランザクション開始
@@ -358,3 +353,18 @@ def truncate_judge(settings_list):
 class InvalidConfigException(Exception):
 pass
+# ローカル実行用コード
+# 値はよしなに変えてください
+# if __name__ == '__main__':
+# DB_INFO = {"host": '127.0.0.1', "name": 'org02', "pass": 'user', "user": 'user'}
+# main(
+# bucket_name='バケット名',
+# target_data_source='投入データのディレクトリ名よりデータソースに該当する部分',
+# target_file_name='投入データのファイル名',
+# settings_key='投入データに該当する個別設定ファイルのフルパス',
+# db_info=DB_INFO,
+# log_info='info',
+# mode='i'
+# )
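main() は拡張SQLファイルを1行ずつ `rstrip()` しながら空白区切りで連結し、1つのSQL文字列を組み立てている。この部分も標準ライブラリのみなのでローカルで確認できる。以下は説明用スケッチで、`raw_body` は `get_object` のレスポンス Body 相当の仮データ。

```python
import io

# 拡張SQLファイル(複数行)に相当する仮のバイト列(説明用)
raw_body = b'UPDATE work_table\n   SET status = 1\n WHERE id = 9;\n'

# 差分後の処理と同じ:各行の末尾の改行・空白を落としてスペースで連結する
ex_sql = ''
for line in io.TextIOWrapper(io.BytesIO(raw_body), encoding='utf-8'):
    ex_sql = f'{ex_sql} {line.rstrip()}'
print(ex_sql)
```

行頭のインデントは `rstrip()` では落ちないため、結果には余分な空白が残るが、SQLとしては問題なく実行できる形になる。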

View File

@@ -1,2 +0,0 @@
-boto3
-PyMySQL

View File

@@ -0,0 +1,12 @@
tests/*
.coverage
.env
.env.example
.report/*
.vscode/*
.pytest_cache/*
*/__pycache__/*
Dockerfile
pytest.ini
README.md
*.sql

View File

@@ -0,0 +1,9 @@
DB_HOST=************
DB_PORT=3306
DB_USERNAME=************
DB_PASSWORD=************
DB_SCHEMA=*****
DUMP_BACKUP_BUCKET=************
LOG_LEVEL=INFO

ecs/export-dbdump/.gitignore vendored Normal file
View File

@@ -0,0 +1,11 @@
.vscode/settings.json
.env
my.cnf
# python
__pycache__
# python test
.pytest_cache
.coverage
.report/

ecs/export-dbdump/.vscode/launch.json vendored Normal file
View File

@@ -0,0 +1,16 @@
{
    // IntelliSense を使用して利用可能な属性を学べます。
    // 既存の属性の説明をホバーして表示します。
    // 詳細情報は次を確認してください: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "(DEBUG) export dbdump",
"type": "python",
"request": "launch",
"program": "entrypoint.py",
"console": "integratedTerminal",
"justMyCode": true
}
]
}

View File

@@ -0,0 +1,31 @@
{
"[python]": {
"editor.defaultFormatter": null,
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": true
}
},
//
"python.defaultInterpreterPath": "<pythonインタプリターのパス>",
"python.linting.lintOnSave": true,
"python.linting.enabled": true,
"python.linting.pylintEnabled": false,
"python.linting.flake8Enabled": true,
"python.linting.flake8Args": [
"--max-line-length=200",
"--ignore=F541"
],
"python.formatting.provider": "autopep8",
"python.formatting.autopep8Path": "autopep8",
"python.formatting.autopep8Args": [
"--max-line-length", "200",
"--ignore=F541"
],
"python.testing.pytestArgs": [
"tests/batch/ultmarc"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
}

View File

@@ -0,0 +1,40 @@
FROM python:3.12-slim-bookworm
ENV TZ="Asia/Tokyo"
# pythonの標準出力をバッファリングしないフラグ
ENV PYTHONUNBUFFERED=1
# pythonのバイトコードを生成しないフラグ
ENV PYTHONDONTWRITEBYTECODE=1
WORKDIR /usr/src/app
COPY Pipfile Pipfile.lock ./
# mysql-apt-config をdpkgでインストールする際に標準出力に渡す文字列ファイルをコピー
COPY mysql_dpkg_selection.txt ./
# 必要なパッケージインストール
RUN apt update && apt install -y less vim curl wget gzip unzip sudo lsb-release
# mysqlをインストール
RUN \
wget https://dev.mysql.com/get/mysql-apt-config_0.8.29-1_all.deb && \
apt install -y gnupg && \
dpkg -i mysql-apt-config_0.8.29-1_all.deb < mysql_dpkg_selection.txt && \
apt update && \
apt install -y mysql-client
# aws cli v2 のインストール
RUN \
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
unzip awscliv2.zip && \
sudo ./aws/install
# python関連のライブラリインストール
RUN \
pip install --upgrade pip wheel setuptools && \
pip install pipenv --no-cache-dir && \
pipenv install --system --deploy && \
pip uninstall -y pipenv virtualenv-clone virtualenv
COPY src ./src
COPY entrypoint.py entrypoint.py
CMD ["python", "entrypoint.py"]

ecs/export-dbdump/Pipfile Normal file
View File

@@ -0,0 +1,16 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
[dev-packages]
autopep8 = "*"
flake8 = "*"
[requires]
python_version = "3.12"
[pipenv]
allow_prereleases = true

ecs/export-dbdump/Pipfile.lock generated Normal file
View File

@@ -0,0 +1,63 @@
{
"_meta": {
"hash": {
"sha256": "2f7808325e11704ced6ad10c85e1d583663a03d7ccabaa9696ab1fe133a6b30c"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.12"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {},
"develop": {
"autopep8": {
"hashes": [
"sha256:89440a4f969197b69a995e4ce0661b031f455a9f776d2c5ba3dbd83466931758",
"sha256:ce8ad498672c845a0c3de2629c15b635ec2b05ef8177a6e7c91c74f3e9b51128"
],
"index": "pypi",
"markers": "python_version >= '3.9'",
"version": "==2.3.2"
},
"flake8": {
"hashes": [
"sha256:1cbc62e65536f65e6d754dfe6f1bada7f5cf392d6f5db3c2b85892466c3e7c1a",
"sha256:c586ffd0b41540951ae41af572e6790dbd49fc12b3aa2541685d253d9bd504bd"
],
"index": "pypi",
"markers": "python_full_version >= '3.8.1'",
"version": "==7.1.2"
},
"mccabe": {
"hashes": [
"sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325",
"sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"
],
"markers": "python_version >= '3.6'",
"version": "==0.7.0"
},
"pycodestyle": {
"hashes": [
"sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
"sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
],
"markers": "python_version >= '3.8'",
"version": "==2.12.1"
},
"pyflakes": {
"hashes": [
"sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f",
"sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a"
],
"markers": "python_version >= '3.8'",
"version": "==3.2.0"
}
}
}

View File

@@ -0,0 +1,48 @@
# 【共通】DBダンプ取得 
## 概要
当処理は特定の機能で利用するものではなく、共通処理として要件に応じて実行することを想定している。
## 環境情報
- Python 3.12
- MySQL 8.0.23
- VSCode
## 環境構築
- Python の構築
- Merck_NewDWH 開発 2021 の Wiki、[Python 環境構築](https://nds-tyo.backlog.com/alias/wiki/1874930)を参照
- 「Pipenv の導入」までを行っておくこと
- 構築完了後、プロジェクト配下で以下のコマンドを実行し、Python の仮想環境を作成する
- `pipenv install --dev --python <pyenvでインストールしたpythonバージョン>`
- この手順で出力される仮想環境のパスは、後述する VSCode の設定手順で使用するため、控えておく
- MySQL の環境構築
- Windows の場合、以下のリンクからダウンロードする
- <https://dev.mysql.com/downloads/installer/>
- Docker を利用する場合、「newsdwh-tools」リポジトリの MySQL 設定を使用すると便利
- 「crm-table-to-ddl」フォルダ内で以下のコマンドを実行する
- `docker-compose up -d`
- Docker の構築手順は、[Docker のセットアップ手順](https://nds-tyo.backlog.com/alias/wiki/1754332)を参照のこと
- データを投入する
- 立ち上げたデータベースに「src05」スキーマを作成する
- [ローカル開発用データ](https://ndstokyo.sharepoint.com/:f:/r/sites/merck-new-dwh-team/Shared%20Documents/03.NewDWH%E6%A7%8B%E7%AF%89%E3%83%95%E3%82%A7%E3%83%BC%E3%82%BA3/02.%E9%96%8B%E7%99%BA/90.%E9%96%8B%E7%99%BA%E5%85%B1%E6%9C%89/%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E9%96%8B%E7%99%BA%E7%94%A8%E3%83%87%E3%83%BC%E3%82%BF?csf=1&web=1&e=VVcRUs)をダウンロードし、mysql コマンドを使用して復元する
- `mysql -h <ホスト名> -P <ポート> -u <ユーザー名> -p src05 < src05_dump.sql`
- 環境変数の設定
- 「.env.example」ファイルをコピーし、「.env」ファイルを作成する
- 環境変数を設定する。設定内容は PRJ メンバーより共有を受けてください
- VSCode の設定
- 「.vscode/recommended_settings.json」ファイルをコピーし、「settings.json」ファイルを作成する
- 「python.defaultInterpreterPath」を、Python の構築手順で作成した仮想環境のパスに変更する
## 実行
- VSCode 上で「F5」キーを押下すると、バッチ処理が起動する。
- 「entrypoint.py」が、バッチ処理のエントリーポイント。
- 実際の処理は、「src/jobctrl_dbdump.py」で行っている。
## フォルダ構成(工事中)

View File

@@ -0,0 +1,10 @@
"""【共通】DBダンプ取得処理のエントリーポイント"""
from src import jobctrl_dbdump
if __name__ == '__main__':
try:
exit(jobctrl_dbdump.exec())
except Exception:
# エラーが起きても、正常系のコードで返す。
# エラーが起きた事実はbatch_process内でログを出す。
exit(0)

View File

@@ -0,0 +1,3 @@
1
1
4

View File

View File

@@ -0,0 +1,106 @@
"""DBダンプ取得"""
import datetime
import os
import subprocess
import textwrap
from src.logging.get_logger import get_logger
from src.system_var import constants, environment
logger = get_logger('DBダンプ取得')
def exec():
try:
logger.info('DBダンプ取得開始')
# 事前処理(共通処理としては空振りする)
_pre_exec()
# メイン処理
# MySQL接続情報を作成する
my_cnf_file_content = f"""
[client]
user={environment.DB_USERNAME}
password={environment.DB_PASSWORD}
host={environment.DB_HOST}
"""
# my.cnfファイルのパス
my_cnf_path = os.path.join('my.cnf')
# my.cnfファイルを生成する
with open(my_cnf_path, 'w') as f:
f.write(textwrap.dedent(my_cnf_file_content)[1:-1])
# ファイルのパーミッションが強いとmysqldumpコマンドが実行できないため
# my.cnfファイルのパーミッションをread-onlyに設定
os.chmod(my_cnf_path, 0o444)
dt_now = datetime.datetime.now()
converted_value = dt_now.strftime('%Y%m%d%H%M%S%f')
dump_file_name = f'backup_rds_{environment.DB_SCHEMA}_{converted_value}.gz'
s3_file_path = f's3://{environment.DUMP_BACKUP_BUCKET}/{constants.DUMP_BACKUP_FOLDER}/{dt_now.year}/{dt_now.strftime("%m")}/{dt_now.strftime("%d")}/{dump_file_name}'
# mysqldumpコマンドを実行し、dumpを取得する
command = [
'mysqldump',
f'--defaults-file={my_cnf_path}',
'-P',
f"{environment.DB_PORT}",
'--no-tablespaces',
'--skip-column-statistics',
'--single-transaction',
'--set-gtid-purged=OFF',
environment.DB_SCHEMA
]
mysqldump_process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# gzipコマンドを実行してdump結果を圧縮する
gzip_process = subprocess.Popen(['gzip', '-c'], stdin=mysqldump_process.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# aws s3 cpコマンドを実行してアップロードする
s3_cp_process = subprocess.Popen(['aws', 's3', 'cp', '-', s3_file_path], stdin=gzip_process.stdout, stderr=subprocess.PIPE)
# mysqldumpの標準出力をgzipに接続したため、標準出力をクローズする
mysqldump_process.stdout.close()
# gzipの標準出力をaws s3 cpに接続したため、標準出力をクローズする
gzip_process.stdout.close()
# パイプラインを実行し、エラーハンドリング
_, error = mysqldump_process.communicate()
if mysqldump_process.returncode != 0:
raise Exception(f'`mysqldump`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
_, error = gzip_process.communicate()
if gzip_process.returncode != 0:
raise Exception(f'`gzip`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
_, error = s3_cp_process.communicate()
if s3_cp_process.returncode != 0:
raise Exception(f'`aws s3 cp`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
# 事後処理(共通処理としては空振りする)
_post_exec()
logger.info('DBダンプ取得 正常終了')
logger.info(f'出力ファイルパス: {s3_file_path}')
return constants.BATCH_EXIT_CODE_SUCCESS
except Exception as e:
logger.exception(f'DBダンプ取得中に想定外のエラーが発生しました :{e}')
return constants.BATCH_EXIT_CODE_SUCCESS
def _pre_exec():
"""
ダンプ取得 事前処理
共通機能としては事前処理を実装しない
事前処理が必要なダンプ取得処理を実装する場合、当ロジックをコピーする
"""
pass
def _post_exec():
"""
ダンプ取得 事後処理
共通機能としては事後処理を実装しない
事後処理が必要なダンプ取得処理を実装する場合、当ロジックをコピーする
"""
pass
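jobctrl_dbdump.py の中心は `mysqldump | gzip | aws s3 cp` の3段パイプラインで、下流へ渡した stdout を親プロセス側で閉じ、`communicate()` 後に各プロセスの `returncode` を検査している。以下は同じ配管を、どの環境でも動くコマンド(`printf` / `gzip`、いずれも実在のコマンドだがこの文書のパイプラインの代役)で再現した説明用スケッチ。

```python
import subprocess

# mysqldump | gzip | aws s3 cp と同じ配管の縮小版:
# printf(生成) -> gzip -c(圧縮) -> gzip -dc(伸長=アップロードの代役)
producer = subprocess.Popen(['printf', 'db dump bytes'], stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
compress = subprocess.Popen(['gzip', '-c'], stdin=producer.stdout, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
decompress = subprocess.Popen(['gzip', '-dc'], stdin=compress.stdout, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

# 下流へ渡した stdout は親側で閉じ、下流の終了が SIGPIPE として上流へ伝播するようにする
producer.stdout.close()
compress.stdout.close()

out, _ = decompress.communicate()
for p in (producer, compress, decompress):
    p.wait()
    if p.returncode != 0:
        raise Exception('パイプライン中のコマンドが失敗しました')
print(out)  # b'db dump bytes'
```

親側で中間 stdout を閉じておかないと、下流が先に落ちた場合に上流へ SIGPIPE が届かず、パイプラインが詰まったままになることがある。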

View File

@@ -0,0 +1,37 @@
import logging
from src.system_var.environment import LOG_LEVEL
# boto3関連モジュールのログレベルを事前に個別指定し、モジュール内のDEBUGログの表示を抑止する
for name in ["boto3", "botocore", "s3transfer", "urllib3"]:
logging.getLogger(name).setLevel(logging.WARNING)
def get_logger(log_name: str) -> logging.Logger:
"""一意のログ出力モジュールを取得します。
Args:
log_name (str): ロガー名
Returns:
logging.Logger: 設定済みのロガー
"""
logger = logging.getLogger(log_name)
level = logging.getLevelName(LOG_LEVEL)
if not isinstance(level, int):
level = logging.INFO
logger.setLevel(level)
if not logger.hasHandlers():
handler = logging.StreamHandler()
logger.addHandler(handler)
formatter = logging.Formatter(
'%(name)s\t[%(levelname)s]\t%(asctime)s\t%(message)s',
'%Y-%m-%d %H:%M:%S'
)
for handler in logger.handlers:
handler.setFormatter(formatter)
return logger

View File

@@ -0,0 +1,5 @@
# バッチ正常終了コード
BATCH_EXIT_CODE_SUCCESS = 0
# ダンプバックアップフォルダー
DUMP_BACKUP_FOLDER = 'dump'

View File

@@ -0,0 +1,19 @@
import os
# Database
DB_HOST = os.environ['DB_HOST']
DB_PORT = int(os.environ['DB_PORT'])
DB_USERNAME = os.environ['DB_USERNAME']
DB_PASSWORD = os.environ['DB_PASSWORD']
DB_SCHEMA = os.environ['DB_SCHEMA']
# AWS
DUMP_BACKUP_BUCKET = os.environ['DUMP_BACKUP_BUCKET']
# 初期値がある環境変数
LOG_LEVEL = os.environ.get('LOG_LEVEL', 'INFO')
DB_CONNECTION_MAX_RETRY_ATTEMPT = int(os.environ.get('DB_CONNECTION_MAX_RETRY_ATTEMPT', 4))
DB_CONNECTION_RETRY_INTERVAL_INIT = int(os.environ.get('DB_CONNECTION_RETRY_INTERVAL', 5))
DB_CONNECTION_RETRY_INTERVAL_MIN_SECONDS = int(os.environ.get('DB_CONNECTION_RETRY_MIN_SECONDS', 5))
DB_CONNECTION_RETRY_INTERVAL_MAX_SECONDS = int(os.environ.get('DB_CONNECTION_RETRY_MAX_SECONDS', 50))
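environment.py の規約は「必須の環境変数は `os.environ[...]`(未設定なら import 時に KeyError で即失敗)、任意の変数は `os.environ.get(...)` で既定値を補完」である。以下はその挙動の確認スケッチで、`DEMO_` で始まる変数名は説明用の仮のもの。

```python
import os

# 必須変数をセットした状態を用意する(説明用)
os.environ['DEMO_DB_HOST'] = 'localhost'

db_host = os.environ['DEMO_DB_HOST']                  # 必須:欠落は起動時に検出される
log_level = os.environ.get('DEMO_LOG_LEVEL', 'INFO')  # 任意:既定値で補完される

missing_detected = False
try:
    os.environ['DEMO_DB_PASSWORD']  # 未設定の必須変数は KeyError になる
except KeyError:
    missing_detected = True

print(db_host, log_level, missing_detected)
```

設定漏れを実行途中ではなくモジュール読み込み時点で検出できるのが、この書き分けの利点である。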

View File

@@ -18,83 +18,20 @@
 "default": {
 "boto3": {
 "hashes": [
-"sha256:791523f41b5e731c8ac0d2f65b978348fb6c92f02e0dbc9a7bc0b3760195cc60",
-"sha256:795803812b78260cfe019ef65021ec9e8049bfb6a027564e1a3b59c1a4a11106"
+"sha256:3faa2c328a61745f3215a63039606a6fcf55d9afe1cc76e3a5e27b9db58cdbf6",
+"sha256:b998edac72f6740bd5d9d585cf3880f2dfeb4842e626b34430fd0e9623378011"
 ],
 "index": "pypi",
-"version": "==1.34.24"
+"markers": "python_version >= '3.9'",
+"version": "==1.38.32"
 },
 "botocore": {
 "hashes": [
-"sha256:4a48c15b87c6a72719a6c2d8688f4e52fe52c18ac9dfcaa25c7e62c5df475ee2",
-"sha256:c92f810b5faec5126f3faf7dc7d77346e407ab3b89bb613290c86ff2fd5405b9"
+"sha256:0899a090e352cb5eeaae2c7bb52a987b469d23912c7ece86664dfb5c2e074978",
+"sha256:64ab919a5d8b74dd73eaac1f978d0e674d11ff3bbe8815c3d2982477be9a082c"
 ],
-"markers": "python_version >= '3.8'",
-"version": "==1.34.24"
+"markers": "python_version >= '3.9'",
+"version": "==1.38.32"
},
"greenlet": {
"hashes": [
"sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67",
"sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6",
"sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257",
"sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4",
"sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676",
"sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61",
"sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc",
"sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca",
"sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7",
"sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728",
"sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305",
"sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6",
"sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379",
"sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414",
"sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04",
"sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a",
"sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf",
"sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491",
"sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559",
"sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e",
"sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274",
"sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb",
"sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b",
"sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9",
"sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b",
"sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be",
"sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506",
"sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405",
"sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113",
"sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f",
"sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5",
"sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230",
"sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d",
"sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f",
"sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a",
"sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e",
"sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61",
"sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6",
"sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d",
"sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71",
"sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22",
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2",
"sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3",
"sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067",
"sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc",
"sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881",
"sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3",
"sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e",
"sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac",
"sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53",
"sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0",
"sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b",
"sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83",
"sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41",
"sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c",
"sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf",
"sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da",
"sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"
],
"markers": "platform_machine == 'aarch64' or (platform_machine == 'ppc64le' or (platform_machine == 'x86_64' or (platform_machine == 'amd64' or (platform_machine == 'AMD64' or (platform_machine == 'win32' or platform_machine == 'WIN32')))))",
"version": "==3.0.3"
},
"jmespath": {
"hashes": [
@@ -106,217 +43,241 @@
},
"pymysql": { "pymysql": {
"hashes": [ "hashes": [
"sha256:4f13a7df8bf36a51e81dd9f3605fede45a4878fe02f9236349fd82a3f0612f96", "sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c",
"sha256:8969ec6d763c856f7073c4c64662882675702efcb114b4bcbb955aea3a069fa7" "sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0"
], ],
"index": "pypi", "index": "pypi",
"version": "==1.1.0" "markers": "python_version >= '3.7'",
"version": "==1.1.1"
}, },
"python-dateutil": { "python-dateutil": {
"hashes": [ "hashes": [
"sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86", "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
"sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9" "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
], ],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==2.8.2" "version": "==2.9.0.post0"
}, },
"s3transfer": { "s3transfer": {
"hashes": [ "hashes": [
"sha256:3cdb40f5cfa6966e812209d0994f2a4709b561c88e90cf00c2696d2df4e56b2e", "sha256:0148ef34d6dd964d0d8cf4311b2b21c474693e57c2e069ec708ce043d2b527be",
"sha256:d0c8bbf672d5eebbe4e57945e23b972d963f07d82f661cabf678a5c88831595b" "sha256:f5e6db74eb7776a37208001113ea7aa97695368242b364d73e91c981ac522177"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.9'",
"version": "==0.10.0" "version": "==0.13.0"
}, },
"six": { "six": {
"hashes": [ "hashes": [
"sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926", "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274",
"sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254" "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"
], ],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'",
"version": "==1.16.0" "version": "==1.17.0"
}, },
"sqlalchemy": { "sqlalchemy": {
"hashes": [ "hashes": [
"sha256:0d3cab3076af2e4aa5693f89622bef7fa770c6fec967143e4da7508b3dceb9b9", "sha256:023b3ee6169969beea3bb72312e44d8b7c27c75b347942d943cf49397b7edeb5",
"sha256:0dacf67aee53b16f365c589ce72e766efaabd2b145f9de7c917777b575e3659d", "sha256:03968a349db483936c249f4d9cd14ff2c296adfa1290b660ba6516f973139582",
"sha256:10331f129982a19df4284ceac6fe87353ca3ca6b4ca77ff7d697209ae0a5915e", "sha256:05132c906066142103b83d9c250b60508af556982a385d96c4eaa9fb9720ac2b",
"sha256:14a6f68e8fc96e5e8f5647ef6cda6250c780612a573d99e4d881581432ef1669", "sha256:087b6b52de812741c27231b5a3586384d60c353fbd0e2f81405a814b5591dc8b",
"sha256:1b1180cda6df7af84fe72e4530f192231b1f29a7496951db4ff38dac1687202d", "sha256:0b3dbf1e7e9bc95f4bac5e2fb6d3fb2f083254c3fdd20a1789af965caf2d2348",
"sha256:29049e2c299b5ace92cbed0c1610a7a236f3baf4c6b66eb9547c01179f638ec5", "sha256:118c16cd3f1b00c76d69343e38602006c9cfb9998fa4f798606d28d63f23beda",
"sha256:342d365988ba88ada8af320d43df4e0b13a694dbd75951f537b2d5e4cb5cd002", "sha256:1936af879e3db023601196a1684d28e12f19ccf93af01bf3280a3262c4b6b4e5",
"sha256:420362338681eec03f53467804541a854617faed7272fe71a1bfdb07336a381e", "sha256:1e3f196a0c59b0cae9a0cd332eb1a4bda4696e863f4f1cf84ab0347992c548c2",
"sha256:4344d059265cc8b1b1be351bfb88749294b87a8b2bbe21dfbe066c4199541ebd", "sha256:23a8825495d8b195c4aa9ff1c430c28f2c821e8c5e2d98089228af887e5d7e29",
"sha256:4f7a7d7fcc675d3d85fbf3b3828ecd5990b8d61bd6de3f1b260080b3beccf215", "sha256:293cd444d82b18da48c9f71cd7005844dbbd06ca19be1ccf6779154439eec0b8",
"sha256:555651adbb503ac7f4cb35834c5e4ae0819aab2cd24857a123370764dc7d7e24", "sha256:32f9dc8c44acdee06c8fc6440db9eae8b4af8b01e4b1aee7bdd7241c22edff4f",
"sha256:59a21853f5daeb50412d459cfb13cb82c089ad4c04ec208cd14dddd99fc23b39", "sha256:34ea30ab3ec98355235972dadc497bb659cc75f8292b760394824fab9cf39826",
"sha256:5fdd402169aa00df3142149940b3bf9ce7dde075928c1886d9a1df63d4b8de62", "sha256:3d3549fc3e40667ec7199033a4e40a2f669898a00a7b18a931d3efb4c7900504",
"sha256:605b6b059f4b57b277f75ace81cc5bc6335efcbcc4ccb9066695e515dbdb3900", "sha256:41836fe661cc98abfae476e14ba1906220f92c4e528771a8a3ae6a151242d2ae",
"sha256:665f0a3954635b5b777a55111ababf44b4fc12b1f3ba0a435b602b6387ffd7cf", "sha256:4d44522480e0bf34c3d63167b8cfa7289c1c54264c2950cc5fc26e7850967e45",
"sha256:6f9e2e59cbcc6ba1488404aad43de005d05ca56e069477b33ff74e91b6319735", "sha256:4eeb195cdedaf17aab6b247894ff2734dcead6c08f748e617bfe05bd5a218443",
"sha256:736ea78cd06de6c21ecba7416499e7236a22374561493b456a1f7ffbe3f6cdb4", "sha256:4f67766965996e63bb46cfbf2ce5355fc32d9dd3b8ad7e536a920ff9ee422e23",
"sha256:74b080c897563f81062b74e44f5a72fa44c2b373741a9ade701d5f789a10ba23", "sha256:57df5dc6fdb5ed1a88a1ed2195fd31927e705cad62dedd86b46972752a80f576",
"sha256:75432b5b14dc2fff43c50435e248b45c7cdadef73388e5610852b95280ffd0e9", "sha256:598d9ebc1e796431bbd068e41e4de4dc34312b7aa3292571bb3674a0cb415dd1",
"sha256:75f99202324383d613ddd1f7455ac908dca9c2dd729ec8584c9541dd41822a2c", "sha256:5b14e97886199c1f52c14629c11d90c11fbb09e9334fa7bb5f6d068d9ced0ce0",
"sha256:790f533fa5c8901a62b6fef5811d48980adeb2f51f1290ade8b5e7ba990ba3de", "sha256:5e22575d169529ac3e0a120cf050ec9daa94b6a9597993d1702884f6954a7d71",
"sha256:798f717ae7c806d67145f6ae94dc7c342d3222d3b9a311a784f371a4333212c7", "sha256:60c578c45c949f909a4026b7807044e7e564adf793537fc762b2489d522f3d11",
"sha256:7c88f0c7dcc5f99bdb34b4fd9b69b93c89f893f454f40219fe923a3a2fd11625", "sha256:6145afea51ff0af7f2564a05fa95eb46f542919e6523729663a5d285ecb3cf5e",
"sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f", "sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f",
"sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a", "sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8",
"sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4", "sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd",
"sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643", "sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814",
"sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018", "sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08",
"sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9", "sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea",
"sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5", "sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30",
"sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08", "sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda",
"sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3", "sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9",
"sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3", "sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923",
"sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed", "sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df",
"sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95", "sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036",
"sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6", "sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3",
"sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0", "sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f",
"sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84", "sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6",
"sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f", "sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04",
"sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d", "sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2",
"sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2", "sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560",
"sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5", "sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70",
"sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570", "sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769",
"sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7", "sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1",
"sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c", "sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6",
"sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf", "sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b",
"sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3", "sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747",
"sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed", "sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078",
"sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5" "sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440",
"sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
"sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
"sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
"sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
"sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
"sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
"sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
"sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
], ],
"index": "pypi", "index": "pypi",
"version": "==2.0.25" "markers": "python_version >= '3.7'",
"version": "==2.0.41"
}, },
"tenacity": { "tenacity": {
"hashes": [ "hashes": [
"sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a", "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb",
"sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c" "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138"
], ],
"index": "pypi", "index": "pypi",
"version": "==8.2.3" "markers": "python_version >= '3.9'",
"version": "==9.1.2"
}, },
"typing-extensions": { "typing-extensions": {
"hashes": [ "hashes": [
"sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783", "sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4",
"sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd" "sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.9'",
"version": "==4.9.0" "version": "==4.14.0"
}, },
"urllib3": { "urllib3": {
"hashes": [ "hashes": [
"sha256:34b97092d7e0a3a8cf7cd10e386f401b3737364026c45e622aa02903dffe0f07", "sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e",
"sha256:f8ecc1bba5667413457c529ab955bf8c67b45db799d159066261719e328580a0" "sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32"
], ],
"markers": "python_version < '3.10'", "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'",
"version": "==1.26.18" "version": "==1.26.20"
} }
},
"develop": {
"autopep8": {
"hashes": [
-"sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
-"sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
+"sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda",
+"sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d"
],
"index": "pypi",
-"version": "==2.0.4"
+"markers": "python_version >= '3.8'",
+"version": "==2.3.1"
},
"boto3": { "boto3": {
"hashes": [ "hashes": [
"sha256:791523f41b5e731c8ac0d2f65b978348fb6c92f02e0dbc9a7bc0b3760195cc60", "sha256:9edf49640c79a05b0a72f4c2d1e24dfc164344b680535a645f455ac624dc3680",
"sha256:795803812b78260cfe019ef65021ec9e8049bfb6a027564e1a3b59c1a4a11106" "sha256:db58348849a5af061f0f5ec9c3b699da5221ca83354059fdccb798e3ddb6b62a"
], ],
"index": "pypi", "index": "pypi",
"version": "==1.34.24" "markers": "python_version >= '3.8'",
"version": "==1.35.57"
}, },
"botocore": { "botocore": {
"hashes": [ "hashes": [
"sha256:4a48c15b87c6a72719a6c2d8688f4e52fe52c18ac9dfcaa25c7e62c5df475ee2", "sha256:92ddd02469213766872cb2399269dd20948f90348b42bf08379881d5e946cc34",
"sha256:c92f810b5faec5126f3faf7dc7d77346e407ab3b89bb613290c86ff2fd5405b9" "sha256:d96306558085baf0bcb3b022d7a8c39c93494f031edb376694d2b2dcd0e81327"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==1.34.24" "version": "==1.35.57"
}, },
"coverage": { "coverage": {
"extras": [ "extras": [
"toml" "toml"
], ],
"hashes": [ "hashes": [
"sha256:04387a4a6ecb330c1878907ce0dc04078ea72a869263e53c72a1ba5bbdf380ca", "sha256:00a1d69c112ff5149cabe60d2e2ee948752c975d95f1e1096742e6077affd376",
"sha256:0676cd0ba581e514b7f726495ea75aba3eb20899d824636c6f59b0ed2f88c471", "sha256:023bf8ee3ec6d35af9c1c6ccc1d18fa69afa1cb29eaac57cb064dbb262a517f9",
"sha256:0e8d06778e8fbffccfe96331a3946237f87b1e1d359d7fbe8b06b96c95a5407a", "sha256:0294ca37f1ba500667b1aef631e48d875ced93ad5e06fa665a3295bdd1d95111",
"sha256:0eb3c2f32dabe3a4aaf6441dde94f35687224dfd7eb2a7f47f3fd9428e421058", "sha256:06babbb8f4e74b063dbaeb74ad68dfce9186c595a15f11f5d5683f748fa1d172",
"sha256:109f5985182b6b81fe33323ab4707011875198c41964f014579cf82cebf2bb85", "sha256:0809082ee480bb8f7416507538243c8863ac74fd8a5d2485c46f0f7499f2b491",
"sha256:13eaf476ec3e883fe3e5fe3707caeb88268a06284484a3daf8250259ef1ba143", "sha256:0b3fb02fe73bed561fa12d279a417b432e5b50fe03e8d663d61b3d5990f29546",
"sha256:164fdcc3246c69a6526a59b744b62e303039a81e42cfbbdc171c91a8cc2f9446", "sha256:0b58c672d14f16ed92a48db984612f5ce3836ae7d72cdd161001cc54512571f2",
"sha256:26776ff6c711d9d835557ee453082025d871e30b3fd6c27fcef14733f67f0590", "sha256:0bcd1069e710600e8e4cf27f65c90c7843fa8edfb4520fb0ccb88894cad08b11",
"sha256:26f66da8695719ccf90e794ed567a1549bb2644a706b41e9f6eae6816b398c4a", "sha256:1032e178b76a4e2b5b32e19d0fd0abbce4b58e77a1ca695820d10e491fa32b08",
"sha256:29f3abe810930311c0b5d1a7140f6395369c3db1be68345638c33eec07535105", "sha256:11a223a14e91a4693d2d0755c7a043db43d96a7450b4f356d506c2562c48642c",
"sha256:316543f71025a6565677d84bc4df2114e9b6a615aa39fb165d697dba06a54af9", "sha256:12394842a3a8affa3ba62b0d4ab7e9e210c5e366fbac3e8b2a68636fb19892c2",
"sha256:36b0ea8ab20d6a7564e89cb6135920bc9188fb5f1f7152e94e8300b7b189441a", "sha256:182e6cd5c040cec0a1c8d415a87b67ed01193ed9ad458ee427741c7d8513d963",
"sha256:3cc9d4bc55de8003663ec94c2f215d12d42ceea128da8f0f4036235a119c88ac", "sha256:1d5b8007f81b88696d06f7df0cb9af0d3b835fe0c8dbf489bad70b45f0e45613",
"sha256:485e9f897cf4856a65a57c7f6ea3dc0d4e6c076c87311d4bc003f82cfe199d25", "sha256:1f76846299ba5c54d12c91d776d9605ae33f8ae2b9d1d3c3703cf2db1a67f2c0",
"sha256:5040148f4ec43644702e7b16ca864c5314ccb8ee0751ef617d49aa0e2d6bf4f2", "sha256:27fb4a050aaf18772db513091c9c13f6cb94ed40eacdef8dad8411d92d9992db",
"sha256:51456e6fa099a8d9d91497202d9563a320513fcf59f33991b0661a4a6f2ad450", "sha256:29155cd511ee058e260db648b6182c419422a0d2e9a4fa44501898cf918866cf",
"sha256:53d7d9158ee03956e0eadac38dfa1ec8068431ef8058fe6447043db1fb40d932", "sha256:29fc0f17b1d3fea332f8001d4558f8214af7f1d87a345f3a133c901d60347c73",
"sha256:5a10a4920def78bbfff4eff8a05c51be03e42f1c3735be42d851f199144897ba", "sha256:2b6b4c83d8e8ea79f27ab80778c19bc037759aea298da4b56621f4474ffeb117",
"sha256:5b14b4f8760006bfdb6e08667af7bc2d8d9bfdb648351915315ea17645347137", "sha256:2fdef0d83a2d08d69b1f2210a93c416d54e14d9eb398f6ab2f0a209433db19e1",
"sha256:5b2ccb7548a0b65974860a78c9ffe1173cfb5877460e5a229238d985565574ae", "sha256:3c65d37f3a9ebb703e710befdc489a38683a5b152242664b973a7b7b22348a4e",
"sha256:697d1317e5290a313ef0d369650cfee1a114abb6021fa239ca12b4849ebbd614", "sha256:4f704f0998911abf728a7783799444fcbbe8261c4a6c166f667937ae6a8aa522",
"sha256:6ae8c9d301207e6856865867d762a4b6fd379c714fcc0607a84b92ee63feff70", "sha256:51b44306032045b383a7a8a2c13878de375117946d68dcb54308111f39775a25",
"sha256:707c0f58cb1712b8809ece32b68996ee1e609f71bd14615bd8f87a1293cb610e", "sha256:53d202fd109416ce011578f321460795abfe10bb901b883cafd9b3ef851bacfc",
"sha256:74775198b702868ec2d058cb92720a3c5a9177296f75bd97317c787daf711505", "sha256:58809e238a8a12a625c70450b48e8767cff9eb67c62e6154a642b21ddf79baea",
"sha256:756ded44f47f330666843b5781be126ab57bb57c22adbb07d83f6b519783b870", "sha256:5915fcdec0e54ee229926868e9b08586376cae1f5faa9bbaf8faf3561b393d52",
"sha256:76f03940f9973bfaee8cfba70ac991825611b9aac047e5c80d499a44079ec0bc", "sha256:5beb1ee382ad32afe424097de57134175fea3faf847b9af002cc7895be4e2a5a",
"sha256:79287fd95585ed36e83182794a57a46aeae0b64ca53929d1176db56aacc83451", "sha256:5f8ae553cba74085db385d489c7a792ad66f7f9ba2ee85bfa508aeb84cf0ba07",
"sha256:799c8f873794a08cdf216aa5d0531c6a3747793b70c53f70e98259720a6fe2d7", "sha256:5fbd612f8a091954a0c8dd4c0b571b973487277d26476f8480bfa4b2a65b5d06",
"sha256:7d360587e64d006402b7116623cebf9d48893329ef035278969fa3bbf75b697e", "sha256:6bd818b7ea14bc6e1f06e241e8234508b21edf1b242d49831831a9450e2f35fa",
"sha256:80b5ee39b7f0131ebec7968baa9b2309eddb35b8403d1869e08f024efd883566", "sha256:6f01ba56b1c0e9d149f9ac85a2f999724895229eb36bd997b61e62999e9b0901",
"sha256:815ac2d0f3398a14286dc2cea223a6f338109f9ecf39a71160cd1628786bc6f5", "sha256:73d2b73584446e66ee633eaad1a56aad577c077f46c35ca3283cd687b7715b0b",
"sha256:83c2dda2666fe32332f8e87481eed056c8b4d163fe18ecc690b02802d36a4d26", "sha256:7bb92c539a624cf86296dd0c68cd5cc286c9eef2d0c3b8b192b604ce9de20a17",
"sha256:846f52f46e212affb5bcf131c952fb4075b55aae6b61adc9856222df89cbe3e2", "sha256:8165b796df0bd42e10527a3f493c592ba494f16ef3c8b531288e3d0d72c1f6f0",
"sha256:936d38794044b26c99d3dd004d8af0035ac535b92090f7f2bb5aa9c8e2f5cd42", "sha256:862264b12ebb65ad8d863d51f17758b1684560b66ab02770d4f0baf2ff75da21",
"sha256:9864463c1c2f9cb3b5db2cf1ff475eed2f0b4285c2aaf4d357b69959941aa555", "sha256:8902dd6a30173d4ef09954bfcb24b5d7b5190cf14a43170e386979651e09ba19",
"sha256:995ea5c48c4ebfd898eacb098164b3cc826ba273b3049e4a889658548e321b43", "sha256:8cf717ee42012be8c0cb205dbbf18ffa9003c4cbf4ad078db47b95e10748eec5",
"sha256:a1526d265743fb49363974b7aa8d5899ff64ee07df47dd8d3e37dcc0818f09ed", "sha256:8ed9281d1b52628e81393f5eaee24a45cbd64965f41857559c2b7ff19385df51",
"sha256:a56de34db7b7ff77056a37aedded01b2b98b508227d2d0979d373a9b5d353daa", "sha256:99b41d18e6b2a48ba949418db48159d7a2e81c5cc290fc934b7d2380515bd0e3",
"sha256:a7c97726520f784239f6c62506bc70e48d01ae71e9da128259d61ca5e9788516", "sha256:9cb7fa111d21a6b55cbf633039f7bc2749e74932e3aa7cb7333f675a58a58bf3",
"sha256:b8e99f06160602bc64da35158bb76c73522a4010f0649be44a4e167ff8555952", "sha256:a181e99301a0ae128493a24cfe5cfb5b488c4e0bf2f8702091473d033494d04f",
"sha256:bb1de682da0b824411e00a0d4da5a784ec6496b6850fdf8c865c1d68c0e318dd", "sha256:a413a096c4cbac202433c850ee43fa326d2e871b24554da8327b01632673a076",
"sha256:bf477c355274a72435ceb140dc42de0dc1e1e0bf6e97195be30487d8eaaf1a09", "sha256:a6b1e54712ba3474f34b7ef7a41e65bd9037ad47916ccb1cc78769bae324c01a",
"sha256:bf635a52fc1ea401baf88843ae8708591aa4adff875e5c23220de43b1ccf575c", "sha256:ade3ca1e5f0ff46b678b66201f7ff477e8fa11fb537f3b55c3f0568fbfe6e718",
"sha256:bfd5db349d15c08311702611f3dccbef4b4e2ec148fcc636cf8739519b4a5c0f", "sha256:b0ac3d42cb51c4b12df9c5f0dd2f13a4f24f01943627120ec4d293c9181219ba",
"sha256:c530833afc4707fe48524a44844493f36d8727f04dcce91fb978c414a8556cc6", "sha256:b369ead6527d025a0fe7bd3864e46dbee3aa8f652d48df6174f8d0bac9e26e0e",
"sha256:cc6d65b21c219ec2072c1293c505cf36e4e913a3f936d80028993dd73c7906b1", "sha256:b57b768feb866f44eeed9f46975f3d6406380275c5ddfe22f531a2bf187eda27",
"sha256:cd3c1e4cb2ff0083758f09be0f77402e1bdf704adb7f89108007300a6da587d0", "sha256:b8d3a03d9bfcaf5b0141d07a88456bb6a4c3ce55c080712fec8418ef3610230e",
"sha256:cfd2a8b6b0d8e66e944d47cdec2f47c48fef2ba2f2dff5a9a75757f64172857e", "sha256:bc66f0bf1d7730a17430a50163bb264ba9ded56739112368ba985ddaa9c3bd09",
"sha256:d0ca5c71a5a1765a0f8f88022c52b6b8be740e512980362f7fdbb03725a0d6b9", "sha256:bf20494da9653f6410213424f5f8ad0ed885e01f7e8e59811f572bdb20b8972e",
"sha256:e7defbb9737274023e2d7af02cac77043c86ce88a907c58f42b580a97d5bcca9", "sha256:c48167910a8f644671de9f2083a23630fbf7a1cb70ce939440cd3328e0919f70",
"sha256:e9d1bf53c4c8de58d22e0e956a79a5b37f754ed1ffdbf1a260d9dcfa2d8a325e", "sha256:c481b47f6b5845064c65a7bc78bc0860e635a9b055af0df46fdf1c58cebf8e8f",
"sha256:ea81d8f9691bb53f4fb4db603203029643caffc82bf998ab5b59ca05560f4c06" "sha256:c7c8b95bf47db6d19096a5e052ffca0a05f335bc63cef281a6e8fe864d450a72",
"sha256:c9b8e184898ed014884ca84c70562b4a82cbc63b044d366fedc68bc2b2f3394a",
"sha256:cc8ff50b50ce532de2fa7a7daae9dd12f0a699bfcd47f20945364e5c31799fef",
"sha256:d541423cdd416b78626b55f123412fcf979d22a2c39fce251b350de38c15c15b",
"sha256:dab4d16dfef34b185032580e2f2f89253d302facba093d5fa9dbe04f569c4f4b",
"sha256:dacbc52de979f2823a819571f2e3a350a7e36b8cb7484cdb1e289bceaf35305f",
"sha256:df57bdbeffe694e7842092c5e2e0bc80fff7f43379d465f932ef36f027179806",
"sha256:ed8fe9189d2beb6edc14d3ad19800626e1d9f2d975e436f84e19efb7fa19469b",
"sha256:f3ddf056d3ebcf6ce47bdaf56142af51bb7fad09e4af310241e9db7a3a8022e1",
"sha256:f8fe4984b431f8621ca53d9380901f62bfb54ff759a1348cd140490ada7b693c",
"sha256:fe439416eb6380de434886b00c859304338f8b19f6f54811984f3420a2e03858"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.9'",
"version": "==7.4.0" "version": "==7.6.4"
}, },
"exceptiongroup": { "exceptiongroup": {
"hashes": [ "hashes": [
"sha256:4bfd3996ac73b41e9b9628b04e079f193850720ea5945fc96a08633c66912f14", "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b",
"sha256:91f5c769735f051a4290d52edd0858999b57e5876e9f85937691bd4c9fa3ed68" "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"
], ],
"markers": "python_version < '3.11'", "markers": "python_version < '3.11'",
"version": "==1.2.0" "version": "==1.2.2"
}, },
"flake8": { "flake8": {
"hashes": [ "hashes": [
"sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132", "sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38",
"sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3" "sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213"
], ],
"index": "pypi", "index": "pypi",
"version": "==7.0.0" "markers": "python_full_version >= '3.8.1'",
"version": "==7.1.1"
}, },
"iniconfig": { "iniconfig": {
"hashes": [ "hashes": [
@ -344,27 +305,27 @@
}, },
"packaging": { "packaging": {
"hashes": [ "hashes": [
"sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5", "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759",
"sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7" "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f"
], ],
"markers": "python_version >= '3.7'", "markers": "python_version >= '3.8'",
"version": "==23.2" "version": "==24.2"
}, },
"pluggy": { "pluggy": {
"hashes": [ "hashes": [
"sha256:cf61ae8f126ac6f7c451172cf30e3e43d3ca77615509771b3a984a0730651e12", "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1",
"sha256:d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7" "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==1.3.0" "version": "==1.5.0"
}, },
"pycodestyle": { "pycodestyle": {
"hashes": [ "hashes": [
"sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f", "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
"sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67" "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==2.11.1" "version": "==2.12.1"
}, },
"pyflakes": { "pyflakes": {
"hashes": [ "hashes": [
@ -376,35 +337,37 @@
}, },
"pytest": { "pytest": {
"hashes": [ "hashes": [
"sha256:42ed2f917ded90ceb752dbe2ecb48c436c2a70d38bc16018c2d11da6426a18b6", "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181",
"sha256:efc82dc5e6f2f41ae5acb9eabdf2ced192f336664c436b24a7db2c6aaafe4efd" "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2"
], ],
"index": "pypi", "index": "pypi",
"version": "==8.0.0rc2" "markers": "python_version >= '3.8'",
"version": "==8.3.3"
}, },
"pytest-cov": { "pytest-cov": {
"hashes": [ "hashes": [
"sha256:3904b13dfbfec47f003b8e77fd5b589cd11904a21ddf1ab38a64f204d6a10ef6", "sha256:eee6f1b9e61008bd34975a4d5bab25801eb31898b032dd55addc93e96fcaaa35",
"sha256:6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a" "sha256:fde0b595ca248bb8e2d76f020b465f3b107c9632e6a1d1705f17834c89dcadc0"
], ],
"index": "pypi", "index": "pypi",
"version": "==4.1.0" "markers": "python_version >= '3.9'",
"version": "==6.0.0"
}, },
"python-dateutil": { "python-dateutil": {
"hashes": [ "hashes": [
"sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86", "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3",
"sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9" "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"
], ],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==2.8.2" "version": "==2.9.0.post0"
}, },
"s3transfer": { "s3transfer": {
"hashes": [ "hashes": [
"sha256:3cdb40f5cfa6966e812209d0994f2a4709b561c88e90cf00c2696d2df4e56b2e", "sha256:263ed587a5803c6c708d3ce44dc4dfedaab4c1a32e8329bab818933d79ddcf5d",
"sha256:d0c8bbf672d5eebbe4e57945e23b972d963f07d82f661cabf678a5c88831595b" "sha256:4f50ed74ab84d474ce614475e0b8d5047ff080810aac5d01ea25231cfc944b0c"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==0.10.0" "version": "==0.10.3"
}, },
"six": { "six": {
"hashes": [ "hashes": [
@ -416,19 +379,19 @@
}, },
"tomli": { "tomli": {
"hashes": [ "hashes": [
"sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc", "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38",
"sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f" "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed"
], ],
"markers": "python_version < '3.11'", "markers": "python_version < '3.11'",
"version": "==2.0.1" "version": "==2.0.2"
}, },
"urllib3": { "urllib3": {
"hashes": [ "hashes": [
"sha256:34b97092d7e0a3a8cf7cd10e386f401b3737364026c45e622aa02903dffe0f07", "sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e",
"sha256:f8ecc1bba5667413457c529ab955bf8c67b45db799d159066261719e328580a0" "sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32"
], ],
"markers": "python_version < '3.10'", "markers": "python_version < '3.10'",
"version": "==1.26.18" "version": "==1.26.20"
} }
} }
} }


@@ -20,8 +20,8 @@ def exec():
    db = Database.get_instance()
    try:
        db.connect()
-        db.to_jst()
        db.begin()
+        db.to_jst()
        logger.debug('DCF施設統合マスタ作成処理開始')
        # COM施設からDCF施設統合マスタに登録
        (is_add_dcf_inst_merge, duplication_inst_records) = _insert_dcf_inst_merge_from_com_inst(db)


@@ -1,4 +1,5 @@
from datetime import datetime, timedelta
+from src.batch.batch_functions import logging_sql
from src.batch.common.batch_context import BatchContext
from src.db.database import Database
@@ -14,8 +15,8 @@ def exec():
    db = Database.get_instance()
    try:
        db.connect()
-        db.to_jst()
        db.begin()
+        db.to_jst()
        logger.debug('DCF施設統合マスタ日次更新処理開始')
        # DCF施設統合マスタ移行先コードのセット(無効フラグが『0(有効)』)
        enabled_dst_inst_merge_records = _set_enabled_dct_inst_merge(db)
@@ -120,12 +121,13 @@ def _add_ult_ident_presc(db: Database, enabled_dst_inst_merge_records: list[dict]):
    logger.info('納入先処方元マスタの登録 終了')

-def _select_emp_chg_inst_ta_cd(db: Database, dcf_inst_cd: str) -> list[dict]:
-    # 従業員担当施設マスタから、DCF施設コードに対応した領域コードの取得
+def _select_primary_key_from_emp_chg_inst(db: Database, dcf_inst_cd: str) -> list[dict]:
+    # 従業員担当施設マスタから、DCF施設コードに対応した領域コードと担当者種別コードの取得
    try:
        sql = """
            SELECT
-                ta_cd
+                ta_cd,
+                emp_chg_type_cd
            FROM
                src05.emp_chg_inst
            WHERE
@@ -134,14 +136,14 @@ def _select_emp_chg_inst_ta_cd(db: Database, dcf_inst_cd: str) -> list[dict]:
            AND (SELECT ht.syor_date FROM src05.hdke_tbl AS ht) < end_date
        """
        params = {'dcf_inst_cd': dcf_inst_cd}
-        emp_chg_inst_ta_cd_records = db.execute_select(sql, params)
+        emp_chg_inst_primary_key_records = db.execute_select(sql, params)
        logging_sql(logger, sql)
-        logger.info('従業員担当施設マスタから領域コードの取得に成功')
+        logger.info('従業員担当施設マスタから領域コード、担当者種別コードの取得に成功')
    except Exception as e:
-        logger.debug('従業員担当施設マスタから領域コードの取得に失敗')
+        logger.debug('従業員担当施設マスタから領域コード、担当者種別コードの取得に失敗')
        raise e
-    return emp_chg_inst_ta_cd_records
+    return emp_chg_inst_primary_key_records
def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]): def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]):
@@ -149,10 +151,10 @@ def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]):
     logger.info('従業員担当施設マスタの登録 開始')
     for enabled_merge_record in enabled_dst_inst_merge_records:
         tekiyo_month_first_day = _get_first_day_of_month(enabled_merge_record['tekiyo_month'])
-        emp_chg_inst_ta_cd_records = _select_emp_chg_inst_ta_cd(db, enabled_merge_record['dcf_inst_cd'])
-        for emp_chg_inst_ta_cd_record in emp_chg_inst_ta_cd_records:
+        emp_chg_inst_primary_key_records = _select_primary_key_from_emp_chg_inst(db, enabled_merge_record['dcf_inst_cd'])
+        for emp_chg_inst_primary_key_record in emp_chg_inst_primary_key_records:
             emp_chg_inst_records = _select_emp_chg_inst(db, enabled_merge_record['dcf_inst_cd'], enabled_merge_record['dup_opp_cd'],
-                                                        emp_chg_inst_ta_cd_record['ta_cd'])
+                                                        emp_chg_inst_primary_key_record['ta_cd'], emp_chg_inst_primary_key_record['emp_chg_type_cd'])
             for emp_chg_inst_row in emp_chg_inst_records:
                 # 重複時相手先コードが存在したかのチェック
                 if emp_chg_inst_row['opp_count'] > 0:
@@ -173,7 +175,7 @@ def _add_emp_chg_inst(db: Database, enabled_dst_inst_merge_records: list[dict]):
                                              emp_chg_inst_row)
                     continue
                 # 適用開始日 ≧ DCF施設統合マスタの適用月度の1日の場合、N(論理削除レコード)に設定する
-                _update_emp_chg_inst_disabled(db, enabled_merge_record['dcf_inst_cd'], emp_chg_inst_row['ta_cd'],
+                _update_emp_chg_inst_disabled(db, enabled_merge_record['dcf_inst_cd'], emp_chg_inst_row['ta_cd'], emp_chg_inst_row['emp_chg_type_cd'],
                                               emp_chg_inst_row['start_date'])
     logger.info('従業員担当施設マスタの登録 終了')
@@ -207,7 +209,7 @@ def _delete_ult_ident_presc(db: Database, start_date: str, ult_ident_presc_row:
         raise e


-def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, start_date: str):
+def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, emp_chg_type_cd: str, start_date: str):
     # emp_chg_instをUPDATE
     try:
         elapsed_time = ElapsedTime()
@@ -221,9 +223,10 @@ def _update_emp_chg_inst_disabled(db: Database, dcf_inst_cd: str, ta_cd: str, st
             WHERE
                 inst_cd = :dcf_inst_cd
                 AND ta_cd = :ta_cd
+                AND emp_chg_type_cd = :emp_chg_type_cd
                 AND start_date = :start_date
         """
-        params = {'dcf_inst_cd': dcf_inst_cd, 'ta_cd': ta_cd, 'start_date': start_date}
+        params = {'dcf_inst_cd': dcf_inst_cd, 'ta_cd': ta_cd, 'emp_chg_type_cd': emp_chg_type_cd, 'start_date': start_date}
         res = db.execute(sql, params)
         logging_sql(logger, sql)
         logger.info(f'従業員担当施設マスタのYorNフラグ更新に成功, {res.rowcount} 行更新 ({elapsed_time.of})')
@@ -246,6 +249,7 @@ def _update_emp_chg_inst_end_date(db: Database, dcf_inst_cd: str, last_end_date:
             WHERE
                 inst_cd = :dcf_inst_cd
                 AND ta_cd = :ta_cd
+                AND emp_chg_type_cd = :emp_chg_type_cd
                 AND emp_cd = :emp_cd
                 AND bu_cd = :bu_cd
                 AND start_date = :start_date
@@ -254,6 +258,7 @@ def _update_emp_chg_inst_end_date(db: Database, dcf_inst_cd: str, last_end_date:
             'end_date': last_end_date,
             'dcf_inst_cd': dcf_inst_cd,
             'ta_cd': emp_chg_inst_row['ta_cd'],
+            'emp_chg_type_cd': emp_chg_inst_row['emp_chg_type_cd'],
             'emp_cd': emp_chg_inst_row['emp_cd'],
             'bu_cd': emp_chg_inst_row['bu_cd'],
             'start_date': emp_chg_inst_row['start_date']
@@ -276,6 +281,7 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
             src05.emp_chg_inst(
                 inst_cd,
                 ta_cd,
+                emp_chg_type_cd,
                 emp_cd,
                 bu_cd,
                 start_date,
@@ -290,6 +296,7 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
             VALUES(
                 :dup_opp_cd,
                 :ta_cd,
+                :emp_chg_type_cd,
                 :emp_cd,
                 :bu_cd,
                 :start_date,
@@ -305,6 +312,7 @@ def _insert_emp_chg_inst(db: Database, dup_opp_cd: str, set_start_date: str,
         params = {
             'dup_opp_cd': dup_opp_cd,
             'ta_cd': emp_chg_inst_row['ta_cd'],
+            'emp_chg_type_cd': emp_chg_inst_row['emp_chg_type_cd'],
             'emp_cd': emp_chg_inst_row['emp_cd'],
             'bu_cd': emp_chg_inst_row['bu_cd'],
             'start_date': set_start_date,
@@ -518,13 +526,14 @@ def _insert_ult_ident_presc(db: Database, set_Start_Date: str, dup_opp_cd: str,
         raise e


-def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd: str) -> list[dict]:
+def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd: str, emp_chg_type_cd: str) -> list[dict]:
     # emp_chg_instからSELECT
     try:
         sql = """
             SELECT
                 eci.inst_cd,
                 eci.ta_cd,
+                eci.emp_chg_type_cd,
                 eci.emp_cd,
                 eci.bu_cd,
                 eci.start_date,
@@ -539,16 +548,18 @@ def _select_emp_chg_inst(db: Database, dcf_inst_cd: str, dup_opp_cd: str, ta_cd:
                 WHERE
                     eciopp.inst_cd = :dup_opp_cd
                     AND eciopp.ta_cd = :ta_cd
+                    AND eciopp.emp_chg_type_cd = :emp_chg_type_cd
                 ) AS opp_count
             FROM
                 src05.emp_chg_inst AS eci
             WHERE
                 eci.inst_cd = :dcf_inst_cd
                 AND eci.ta_cd = :ta_cd
+                AND eci.emp_chg_type_cd = :emp_chg_type_cd
                 AND eci.enabled_flg = 'Y'
                 AND (SELECT ht.syor_date FROM src05.hdke_tbl AS ht) < eci.end_date
         """
-        params = {'dcf_inst_cd': dcf_inst_cd, 'dup_opp_cd': dup_opp_cd, 'ta_cd': ta_cd}
+        params = {'dcf_inst_cd': dcf_inst_cd, 'dup_opp_cd': dup_opp_cd, 'ta_cd': ta_cd, 'emp_chg_type_cd': emp_chg_type_cd}
         emp_chg_inst_records = db.execute_select(sql, params)
         logging_sql(logger, sql)
         logger.info('従業員担当施設マスタの取得 成功')
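The hunks above widen the lookup key of `src05.emp_chg_inst` from (`inst_cd`, `ta_cd`) to (`inst_cd`, `ta_cd`, `emp_chg_type_cd`) across the SELECT, UPDATE, and INSERT paths. A minimal pure-Python sketch of why the partial key over-matches — the rows and values are invented for illustration, not taken from the real table:

```python
# Hypothetical emp_chg_inst rows: two rows share inst_cd/ta_cd and
# differ only in emp_chg_type_cd, as the widened key implies is possible.
rows = [
    {"inst_cd": "I001", "ta_cd": "T1", "emp_chg_type_cd": "01", "emp_cd": "E1"},
    {"inst_cd": "I001", "ta_cd": "T1", "emp_chg_type_cd": "02", "emp_cd": "E2"},
]


def select_emp_chg_inst(inst_cd, ta_cd, emp_chg_type_cd=None):
    """Mimic the WHERE clause; without emp_chg_type_cd both rows match."""
    hits = [r for r in rows if r["inst_cd"] == inst_cd and r["ta_cd"] == ta_cd]
    if emp_chg_type_cd is not None:
        hits = [r for r in hits if r["emp_chg_type_cd"] == emp_chg_type_cd]
    return hits


print(len(select_emp_chg_inst("I001", "T1")))        # 2: partial key over-matches
print(len(select_emp_chg_inst("I001", "T1", "01")))  # 1: full key pins one row
```

This is why every query in the diff gains both the extra `AND … = :emp_chg_type_cd` predicate and the extra bind parameter.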

View File

@@ -53,7 +53,9 @@ def _insert_into_emp_chg_inst_lau_from_emp_chg_inst(db: Database):
                 src05.emp_chg_inst_lau
             SELECT
                 inst_cd,
-                ta_cd,emp_cd,
+                ta_cd,
+                emp_chg_type_cd,
+                emp_cd,
                 bu_cd,
                 start_date,
                 end_date,
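`INSERT INTO … SELECT` maps columns positionally, so the new target column has to appear in the SELECT list at the matching position. A runnable sketch with an in-memory SQLite table — the table layout here is a simplified assumption, not the real `src05` schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Simplified shapes of emp_chg_inst and its laundering copy.
con.execute("CREATE TABLE emp_chg_inst (inst_cd, ta_cd, emp_chg_type_cd, emp_cd)")
con.execute("CREATE TABLE emp_chg_inst_lau (inst_cd, ta_cd, emp_chg_type_cd, emp_cd)")
con.execute("INSERT INTO emp_chg_inst VALUES ('I001', 'T1', '01', 'E1')")

# The SELECT list must supply every target column, in the target's order.
con.execute("""
    INSERT INTO emp_chg_inst_lau
    SELECT inst_cd, ta_cd, emp_chg_type_cd, emp_cd FROM emp_chg_inst
""")
print(con.execute("SELECT * FROM emp_chg_inst_lau").fetchall())
# [('I001', 'T1', '01', 'E1')]
```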

View File

@@ -86,7 +86,14 @@ def _insert_mst_inst_from_fcl_mst_v(db: Database):
             END AS address,
             fmv1.postal_cd,
             fmv1.tel_num,
-            LEFT(fmv1.closed_dt, 10),
+            CASE
+                WHEN
+                    fmv1.fcl_type BETWEEN '20' AND '29' THEN LEFT(fmv1.closed_dt, 10)
+                WHEN
+                    fmv1.fcl_type IN ('A1', 'A0') AND fmv1.end_date != '9999-12-31' THEN DATE_FORMAT(fmv1.end_date, "%Y-%m-%d")
+                ELSE
+                    null
+            END AS delete_date,
             fmv1.v_inst_cd,
             fmv1.ins_dt,
             fmv1.upd_dt
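The new CASE expression derives `delete_date` from two different source columns depending on the facility type. A pure-Python rendering of the same branching — the meaning of the `fcl_type` codes and the input date formats are assumptions made for illustration:

```python
from typing import Optional


def derive_delete_date(fcl_type: str, closed_dt: str, end_date: str) -> Optional[str]:
    """Mirror the SQL CASE: BETWEEN on strings compares lexicographically,
    and end_date is assumed to start with a 'YYYY-MM-DD' date."""
    if "20" <= fcl_type <= "29":
        return closed_dt[:10]   # LEFT(fmv1.closed_dt, 10)
    if fcl_type in ("A1", "A0") and end_date != "9999-12-31":
        return end_date[:10]    # DATE_FORMAT(fmv1.end_date, "%Y-%m-%d")
    return None                 # ELSE null


print(derive_delete_date("25", "2024-01-31 00:00:00", "9999-12-31"))  # 2024-01-31
print(derive_delete_date("A1", "", "2023-03-31"))                     # 2023-03-31
print(derive_delete_date("A0", "", "9999-12-31"))                     # None
```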

View File

@@ -1,8 +1,9 @@
 from src.batch.common.batch_context import BatchContext
-from src.batch.laundering import (
-    create_inst_merge_for_laundering, emp_chg_inst_laundering,
-    ult_ident_presc_laundering, sales_results_laundering)
 from src.batch.dcf_inst_merge import integrate_dcf_inst_merge
+from src.batch.laundering import (create_inst_merge_for_laundering,
+                                  emp_chg_inst_laundering,
+                                  sales_results_laundering,
+                                  ult_ident_presc_laundering)
 from src.logging.get_logger import get_logger

 batch_context = BatchContext.get_instance()
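The rewritten import block follows isort-style ordering: the laundering names are now listed alphabetically (`sales_results_laundering` moves ahead of `ult_ident_presc_laundering`). The ordering rule itself is plain lexicographic sort:

```python
# The four laundering entry points from the diff, in their old order.
names = ["create_inst_merge_for_laundering", "emp_chg_inst_laundering",
         "ult_ident_presc_laundering", "sales_results_laundering"]

# sorted() reproduces the order the new import block uses.
print(sorted(names))
```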

View File

@@ -16,166 +16,115 @@
         ]
     },
     "default": {
"greenlet": {
"hashes": [
"sha256:01bc7ea167cf943b4c802068e178bbf70ae2e8c080467070d01bfa02f337ee67",
"sha256:0448abc479fab28b00cb472d278828b3ccca164531daab4e970a0458786055d6",
"sha256:086152f8fbc5955df88382e8a75984e2bb1c892ad2e3c80a2508954e52295257",
"sha256:098d86f528c855ead3479afe84b49242e174ed262456c342d70fc7f972bc13c4",
"sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676",
"sha256:1551a8195c0d4a68fac7a4325efac0d541b48def35feb49d803674ac32582f61",
"sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc",
"sha256:1996cb9306c8595335bb157d133daf5cf9f693ef413e7673cb07e3e5871379ca",
"sha256:1a7191e42732df52cb5f39d3527217e7ab73cae2cb3694d241e18f53d84ea9a7",
"sha256:1ea188d4f49089fc6fb283845ab18a2518d279c7cd9da1065d7a84e991748728",
"sha256:1f672519db1796ca0d8753f9e78ec02355e862d0998193038c7073045899f305",
"sha256:2516a9957eed41dd8f1ec0c604f1cdc86758b587d964668b5b196a9db5bfcde6",
"sha256:2797aa5aedac23af156bbb5a6aa2cd3427ada2972c828244eb7d1b9255846379",
"sha256:2dd6e660effd852586b6a8478a1d244b8dc90ab5b1321751d2ea15deb49ed414",
"sha256:3ddc0f794e6ad661e321caa8d2f0a55ce01213c74722587256fb6566049a8b04",
"sha256:3ed7fb269f15dc662787f4119ec300ad0702fa1b19d2135a37c2c4de6fadfd4a",
"sha256:419b386f84949bf0e7c73e6032e3457b82a787c1ab4a0e43732898a761cc9dbf",
"sha256:43374442353259554ce33599da8b692d5aa96f8976d567d4badf263371fbe491",
"sha256:52f59dd9c96ad2fc0d5724107444f76eb20aaccb675bf825df6435acb7703559",
"sha256:57e8974f23e47dac22b83436bdcf23080ade568ce77df33159e019d161ce1d1e",
"sha256:5b51e85cb5ceda94e79d019ed36b35386e8c37d22f07d6a751cb659b180d5274",
"sha256:649dde7de1a5eceb258f9cb00bdf50e978c9db1b996964cd80703614c86495eb",
"sha256:64d7675ad83578e3fc149b617a444fab8efdafc9385471f868eb5ff83e446b8b",
"sha256:68834da854554926fbedd38c76e60c4a2e3198c6fbed520b106a8986445caaf9",
"sha256:6b66c9c1e7ccabad3a7d037b2bcb740122a7b17a53734b7d72a344ce39882a1b",
"sha256:70fb482fdf2c707765ab5f0b6655e9cfcf3780d8d87355a063547b41177599be",
"sha256:7170375bcc99f1a2fbd9c306f5be8764eaf3ac6b5cb968862cad4c7057756506",
"sha256:73a411ef564e0e097dbe7e866bb2dda0f027e072b04da387282b02c308807405",
"sha256:77457465d89b8263bca14759d7c1684df840b6811b2499838cc5b040a8b5b113",
"sha256:7f362975f2d179f9e26928c5b517524e89dd48530a0202570d55ad6ca5d8a56f",
"sha256:81bb9c6d52e8321f09c3d165b2a78c680506d9af285bfccbad9fb7ad5a5da3e5",
"sha256:881b7db1ebff4ba09aaaeae6aa491daeb226c8150fc20e836ad00041bcb11230",
"sha256:894393ce10ceac937e56ec00bb71c4c2f8209ad516e96033e4b3b1de270e200d",
"sha256:99bf650dc5d69546e076f413a87481ee1d2d09aaaaaca058c9251b6d8c14783f",
"sha256:9da2bd29ed9e4f15955dd1595ad7bc9320308a3b766ef7f837e23ad4b4aac31a",
"sha256:afaff6cf5200befd5cec055b07d1c0a5a06c040fe5ad148abcd11ba6ab9b114e",
"sha256:b1b5667cced97081bf57b8fa1d6bfca67814b0afd38208d52538316e9422fc61",
"sha256:b37eef18ea55f2ffd8f00ff8fe7c8d3818abd3e25fb73fae2ca3b672e333a7a6",
"sha256:b542be2440edc2d48547b5923c408cbe0fc94afb9f18741faa6ae970dbcb9b6d",
"sha256:b7dcbe92cc99f08c8dd11f930de4d99ef756c3591a5377d1d9cd7dd5e896da71",
"sha256:b7f009caad047246ed379e1c4dbcb8b020f0a390667ea74d2387be2998f58a22",
"sha256:bba5387a6975598857d86de9eac14210a49d554a77eb8261cc68b7d082f78ce2",
"sha256:c5e1536de2aad7bf62e27baf79225d0d64360d4168cf2e6becb91baf1ed074f3",
"sha256:c5ee858cfe08f34712f548c3c363e807e7186f03ad7a5039ebadb29e8c6be067",
"sha256:c9db1c18f0eaad2f804728c67d6c610778456e3e1cc4ab4bbd5eeb8e6053c6fc",
"sha256:d353cadd6083fdb056bb46ed07e4340b0869c305c8ca54ef9da3421acbdf6881",
"sha256:d46677c85c5ba00a9cb6f7a00b2bfa6f812192d2c9f7d9c4f6a55b60216712f3",
"sha256:d4d1ac74f5c0c0524e4a24335350edad7e5f03b9532da7ea4d3c54d527784f2e",
"sha256:d73a9fe764d77f87f8ec26a0c85144d6a951a6c438dfe50487df5595c6373eac",
"sha256:da70d4d51c8b306bb7a031d5cff6cc25ad253affe89b70352af5f1cb68e74b53",
"sha256:daf3cb43b7cf2ba96d614252ce1684c1bccee6b2183a01328c98d36fcd7d5cb0",
"sha256:dca1e2f3ca00b84a396bc1bce13dd21f680f035314d2379c4160c98153b2059b",
"sha256:dd4f49ae60e10adbc94b45c0b5e6a179acc1736cf7a90160b404076ee283cf83",
"sha256:e1f145462f1fa6e4a4ae3c0f782e580ce44d57c8f2c7aae1b6fa88c0b2efdb41",
"sha256:e3391d1e16e2a5a1507d83e4a8b100f4ee626e8eca43cf2cadb543de69827c4c",
"sha256:fcd2469d6a2cf298f198f0487e0a5b1a47a42ca0fa4dfd1b6862c999f018ebbf",
"sha256:fd096eb7ffef17c456cfa587523c5f92321ae02427ff955bebe9e3c63bc9f0da",
"sha256:fe754d231288e1e64323cfad462fcee8f0288654c10bdf4f603a39ed923bef33"
],
"markers": "platform_machine == 'aarch64' or (platform_machine == 'ppc64le' or (platform_machine == 'x86_64' or (platform_machine == 'amd64' or (platform_machine == 'AMD64' or (platform_machine == 'win32' or platform_machine == 'WIN32')))))",
"version": "==3.0.3"
},
         "pymysql": {
             "hashes": [
-                "sha256:4f13a7df8bf36a51e81dd9f3605fede45a4878fe02f9236349fd82a3f0612f96",
+                "sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c",
-                "sha256:8969ec6d763c856f7073c4c64662882675702efcb114b4bcbb955aea3a069fa7"
+                "sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0"
             ],
             "index": "pypi",
-            "version": "==1.1.0"
+            "markers": "python_version >= '3.7'",
+            "version": "==1.1.1"
         },
         "sqlalchemy": {
             "hashes": [
-                "sha256:0d3cab3076af2e4aa5693f89622bef7fa770c6fec967143e4da7508b3dceb9b9",
+                "sha256:023b3ee6169969beea3bb72312e44d8b7c27c75b347942d943cf49397b7edeb5",
-                "sha256:0dacf67aee53b16f365c589ce72e766efaabd2b145f9de7c917777b575e3659d",
+                "sha256:03968a349db483936c249f4d9cd14ff2c296adfa1290b660ba6516f973139582",
-                "sha256:10331f129982a19df4284ceac6fe87353ca3ca6b4ca77ff7d697209ae0a5915e",
+                "sha256:05132c906066142103b83d9c250b60508af556982a385d96c4eaa9fb9720ac2b",
-                "sha256:14a6f68e8fc96e5e8f5647ef6cda6250c780612a573d99e4d881581432ef1669",
+                "sha256:087b6b52de812741c27231b5a3586384d60c353fbd0e2f81405a814b5591dc8b",
-                "sha256:1b1180cda6df7af84fe72e4530f192231b1f29a7496951db4ff38dac1687202d",
+                "sha256:0b3dbf1e7e9bc95f4bac5e2fb6d3fb2f083254c3fdd20a1789af965caf2d2348",
-                "sha256:29049e2c299b5ace92cbed0c1610a7a236f3baf4c6b66eb9547c01179f638ec5",
+                "sha256:118c16cd3f1b00c76d69343e38602006c9cfb9998fa4f798606d28d63f23beda",
-                "sha256:342d365988ba88ada8af320d43df4e0b13a694dbd75951f537b2d5e4cb5cd002",
+                "sha256:1936af879e3db023601196a1684d28e12f19ccf93af01bf3280a3262c4b6b4e5",
-                "sha256:420362338681eec03f53467804541a854617faed7272fe71a1bfdb07336a381e",
+                "sha256:1e3f196a0c59b0cae9a0cd332eb1a4bda4696e863f4f1cf84ab0347992c548c2",
-                "sha256:4344d059265cc8b1b1be351bfb88749294b87a8b2bbe21dfbe066c4199541ebd",
+                "sha256:23a8825495d8b195c4aa9ff1c430c28f2c821e8c5e2d98089228af887e5d7e29",
-                "sha256:4f7a7d7fcc675d3d85fbf3b3828ecd5990b8d61bd6de3f1b260080b3beccf215",
+                "sha256:293cd444d82b18da48c9f71cd7005844dbbd06ca19be1ccf6779154439eec0b8",
-                "sha256:555651adbb503ac7f4cb35834c5e4ae0819aab2cd24857a123370764dc7d7e24",
+                "sha256:32f9dc8c44acdee06c8fc6440db9eae8b4af8b01e4b1aee7bdd7241c22edff4f",
-                "sha256:59a21853f5daeb50412d459cfb13cb82c089ad4c04ec208cd14dddd99fc23b39",
+                "sha256:34ea30ab3ec98355235972dadc497bb659cc75f8292b760394824fab9cf39826",
-                "sha256:5fdd402169aa00df3142149940b3bf9ce7dde075928c1886d9a1df63d4b8de62",
+                "sha256:3d3549fc3e40667ec7199033a4e40a2f669898a00a7b18a931d3efb4c7900504",
-                "sha256:605b6b059f4b57b277f75ace81cc5bc6335efcbcc4ccb9066695e515dbdb3900",
+                "sha256:41836fe661cc98abfae476e14ba1906220f92c4e528771a8a3ae6a151242d2ae",
-                "sha256:665f0a3954635b5b777a55111ababf44b4fc12b1f3ba0a435b602b6387ffd7cf",
+                "sha256:4d44522480e0bf34c3d63167b8cfa7289c1c54264c2950cc5fc26e7850967e45",
-                "sha256:6f9e2e59cbcc6ba1488404aad43de005d05ca56e069477b33ff74e91b6319735",
+                "sha256:4eeb195cdedaf17aab6b247894ff2734dcead6c08f748e617bfe05bd5a218443",
-                "sha256:736ea78cd06de6c21ecba7416499e7236a22374561493b456a1f7ffbe3f6cdb4",
+                "sha256:4f67766965996e63bb46cfbf2ce5355fc32d9dd3b8ad7e536a920ff9ee422e23",
-                "sha256:74b080c897563f81062b74e44f5a72fa44c2b373741a9ade701d5f789a10ba23",
+                "sha256:57df5dc6fdb5ed1a88a1ed2195fd31927e705cad62dedd86b46972752a80f576",
-                "sha256:75432b5b14dc2fff43c50435e248b45c7cdadef73388e5610852b95280ffd0e9",
+                "sha256:598d9ebc1e796431bbd068e41e4de4dc34312b7aa3292571bb3674a0cb415dd1",
-                "sha256:75f99202324383d613ddd1f7455ac908dca9c2dd729ec8584c9541dd41822a2c",
+                "sha256:5b14e97886199c1f52c14629c11d90c11fbb09e9334fa7bb5f6d068d9ced0ce0",
-                "sha256:790f533fa5c8901a62b6fef5811d48980adeb2f51f1290ade8b5e7ba990ba3de",
+                "sha256:5e22575d169529ac3e0a120cf050ec9daa94b6a9597993d1702884f6954a7d71",
-                "sha256:798f717ae7c806d67145f6ae94dc7c342d3222d3b9a311a784f371a4333212c7",
+                "sha256:60c578c45c949f909a4026b7807044e7e564adf793537fc762b2489d522f3d11",
-                "sha256:7c88f0c7dcc5f99bdb34b4fd9b69b93c89f893f454f40219fe923a3a2fd11625",
+                "sha256:6145afea51ff0af7f2564a05fa95eb46f542919e6523729663a5d285ecb3cf5e",
-                "sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f",
+                "sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f",
-                "sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a",
+                "sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8",
-                "sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4",
+                "sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd",
-                "sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643",
+                "sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814",
-                "sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018",
+                "sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08",
-                "sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9",
+                "sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea",
-                "sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5",
+                "sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30",
-                "sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08",
+                "sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda",
-                "sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3",
+                "sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9",
-                "sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3",
+                "sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923",
-                "sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed",
+                "sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df",
-                "sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95",
+                "sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036",
-                "sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6",
+                "sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3",
-                "sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0",
+                "sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f",
-                "sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84",
+                "sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6",
-                "sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f",
+                "sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04",
-                "sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d",
+                "sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2",
-                "sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2",
+                "sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560",
-                "sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5",
+                "sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70",
-                "sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570",
+                "sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769",
-                "sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7",
+                "sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1",
-                "sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c",
+                "sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6",
-                "sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf",
+                "sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b",
-                "sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3",
+                "sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747",
-                "sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed",
+                "sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078",
-                "sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5"
+                "sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440",
+                "sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
+                "sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
+                "sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
+                "sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
+                "sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
+                "sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
+                "sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
+                "sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
             ],
             "index": "pypi",
-            "version": "==2.0.25"
+            "markers": "python_version >= '3.7'",
+            "version": "==2.0.41"
         },
         "tenacity": {
             "hashes": [
-                "sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a",
+                "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb",
-                "sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c"
+                "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138"
             ],
             "index": "pypi",
-            "version": "==8.2.3"
+            "markers": "python_version >= '3.9'",
+            "version": "==9.1.2"
        },
         "typing-extensions": {
             "hashes": [
-                "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783",
+                "sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4",
-                "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"
+                "sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af"
             ],
-            "markers": "python_version >= '3.8'",
+            "markers": "python_version >= '3.9'",
-            "version": "==4.9.0"
+            "version": "==4.14.0"
         }
     },
     "develop": {
         "autopep8": {
             "hashes": [
-                "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
+                "sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda",
-                "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
+                "sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d"
             ],
             "index": "pypi",
-            "version": "==2.0.4"
+            "markers": "python_version >= '3.8'",
+            "version": "==2.3.1"
         },
         "flake8": {
             "hashes": [
-                "sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132",
+                "sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38",
-                "sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3"
+                "sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213"
             ],
             "index": "pypi",
-            "version": "==7.0.0"
+            "markers": "python_full_version >= '3.8.1'",
+            "version": "==7.1.1"
         },
         "mccabe": {
             "hashes": [
@@ -187,11 +136,11 @@
         },
         "pycodestyle": {
             "hashes": [
-                "sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f",
+                "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
-                "sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67"
+                "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
             ],
             "markers": "python_version >= '3.8'",
-            "version": "==2.11.1"
+            "version": "==2.12.1"
         },
         "pyflakes": {
             "hashes": [
@@ -203,11 +152,11 @@
         },
         "tomli": {
             "hashes": [
-                "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc",
+                "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38",
-                "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"
+                "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed"
             ],
             "markers": "python_version < '3.11'",
-            "version": "==2.0.1"
+            "version": "==2.0.2"
         }
     }
 }

View File

@@ -1,4 +1,4 @@
-FROM python:3.9-bullseye
+FROM python:3.9

 ENV TZ="Asia/Tokyo"

"sha256:7d505815ac340568fd03f719446a589162d55c52f08abd77ba8964fbb7eb5b5f", "sha256:6375cd674fe82d7aa9816d1cb96ec592bac1726c11e0cafbf40eeee9a4516b5f",
"sha256:84daa0a2055df9ca0f148a64fdde12ac635e30edbca80e87df9b3aaf419e144a", "sha256:6854175807af57bdb6425e47adbce7d20a4d79bbfd6f6d6519cd10bb7109a7f8",
"sha256:87d91043ea0dc65ee583026cb18e1b458d8ec5fc0a93637126b5fc0bc3ea68c4", "sha256:6ab60a5089a8f02009f127806f777fca82581c49e127f08413a66056bd9166dd",
"sha256:87f6e732bccd7dcf1741c00f1ecf33797383128bd1c90144ac8adc02cbb98643", "sha256:725875a63abf7c399d4548e686debb65cdc2549e1825437096a0af1f7e374814",
"sha256:884272dcd3ad97f47702965a0e902b540541890f468d24bd1d98bcfe41c3f018", "sha256:7492967c3386df69f80cf67efd665c0f667cee67032090fe01d7d74b0e19bb08",
"sha256:8b8cb63d3ea63b29074dcd29da4dc6a97ad1349151f2d2949495418fd6e48db9", "sha256:81965cc20848ab06583506ef54e37cf15c83c7e619df2ad16807c03100745dea",
"sha256:91f7d9d1c4dd1f4f6e092874c128c11165eafcf7c963128f79e28f8445de82d5", "sha256:81c24e0c0fde47a9723c81d5806569cddef103aebbf79dbc9fcbb617153dea30",
"sha256:a2c69a7664fb2d54b8682dd774c3b54f67f84fa123cf84dda2a5f40dcaa04e08", "sha256:81eedafa609917040d39aa9332e25881a8e7a0862495fcdf2023a9667209deda",
"sha256:a3be4987e3ee9d9a380b66393b77a4cd6d742480c951a1c56a23c335caca4ce3", "sha256:81f413674d85cfd0dfcd6512e10e0f33c19c21860342a4890c3a2b59479929f9",
"sha256:a86b4240e67d4753dc3092d9511886795b3c2852abe599cffe108952f7af7ac3", "sha256:8280856dd7c6a68ab3a164b4a4b1c51f7691f6d04af4d4ca23d6ecf2261b7923",
"sha256:aa9373708763ef46782d10e950b49d0235bfe58facebd76917d3f5cbf5971aed", "sha256:82ca366a844eb551daff9d2e6e7a9e5e76d2612c8564f58db6c19a726869c1df",
"sha256:b64b183d610b424a160b0d4d880995e935208fc043d0302dd29fee32d1ee3f95", "sha256:8b4af17bda11e907c51d10686eda89049f9ce5669b08fbe71a29747f1e876036",
"sha256:b801154027107461ee992ff4b5c09aa7cc6ec91ddfe50d02bca344918c3265c6", "sha256:90144d3b0c8b139408da50196c5cad2a6909b51b23df1f0538411cd23ffa45d3",
"sha256:bb209a73b8307f8fe4fe46f6ad5979649be01607f11af1eb94aa9e8a3aaf77f0", "sha256:906e6b0d7d452e9a98e5ab8507c0da791856b2380fdee61b765632bb8698026f",
"sha256:bc8b7dabe8e67c4832891a5d322cec6d44ef02f432b4588390017f5cec186a84", "sha256:90c11ceb9a1f482c752a71f203a81858625d8df5746d787a4786bca4ffdf71c6",
"sha256:c51db269513917394faec5e5c00d6f83829742ba62e2ac4fa5c98d58be91662f", "sha256:911cc493ebd60de5f285bcae0491a60b4f2a9f0f5c270edd1c4dbaef7a38fc04",
"sha256:c55731c116806836a5d678a70c84cb13f2cedba920212ba7dcad53260997666d", "sha256:9a420a91913092d1e20c86a2f5f1fc85c1a8924dbcaf5e0586df8aceb09c9cc2",
"sha256:cf18ff7fc9941b8fc23437cc3e68ed4ebeff3599eec6ef5eebf305f3d2e9a7c2", "sha256:9f8c9fdd15a55d9465e590a402f42082705d66b05afc3ffd2d2eb3c6ba919560",
"sha256:d24f571990c05f6b36a396218f251f3e0dda916e0c687ef6fdca5072743208f5", "sha256:a104c5694dfd2d864a6f91b0956eb5d5883234119cb40010115fd45a16da5e70",
"sha256:db854730a25db7c956423bb9fb4bdd1216c839a689bf9cc15fada0a7fb2f4570", "sha256:a373a400f3e9bac95ba2a06372c4fd1412a7cee53c37fc6c05f829bf672b8769",
"sha256:dc55990143cbd853a5d038c05e79284baedf3e299661389654551bd02a6a68d7", "sha256:a62448526dd9ed3e3beedc93df9bb6b55a436ed1474db31a2af13b313a70a7e1",
"sha256:e607cdd99cbf9bb80391f54446b86e16eea6ad309361942bf88318bcd452363c", "sha256:a8808d5cf866c781150d36a3c8eb3adccfa41a8105d031bf27e92c251e3969d6",
"sha256:ecf6d4cda1f9f6cb0b45803a01ea7f034e2f1aed9475e883410812d9f9e3cfcf", "sha256:b1f09b6821406ea1f94053f346f28f8215e293344209129a9c0fcc3578598d7b",
"sha256:f2a159111a0f58fb034c93eeba211b4141137ec4b0a6e75789ab7a3ef3c7e7e3", "sha256:b2ac41acfc8d965fb0c464eb8f44995770239668956dc4cdf502d1b1ffe0d747",
"sha256:f37c0caf14b9e9b9e8f6dbc81bc56db06acb4363eba5a633167781a48ef036ed", "sha256:b46fa6eae1cd1c20e6e6f44e19984d438b6b2d8616d21d783d150df714f44078",
"sha256:f5693145220517b5f42393e07a6898acdfe820e136c98663b971906120549da5" "sha256:b50eab9994d64f4a823ff99a0ed28a6903224ddbe7fef56a6dd865eec9243440",
"sha256:bfc9064f6658a3d1cadeaa0ba07570b83ce6801a1314985bf98ec9b95d74e15f",
"sha256:c0b0e5e1b5d9f3586601048dd68f392dc0cc99a59bb5faf18aab057ce00d00b2",
"sha256:c153265408d18de4cc5ded1941dcd8315894572cddd3c58df5d5b5705b3fa28d",
"sha256:d4ae769b9c1c7757e4ccce94b0641bc203bbdf43ba7a2413ab2523d8d047d8dc",
"sha256:dc56c9788617b8964ad02e8fcfeed4001c1f8ba91a9e1f31483c0dffb207002a",
"sha256:dd5ec3aa6ae6e4d5b5de9357d2133c07be1aff6405b136dad753a16afb6717dd",
"sha256:edba70118c4be3c2b1f90754d308d0b79c6fe2c0fdc52d8ddf603916f83f4db9",
"sha256:ff8e80c4c4932c10493ff97028decfdb622de69cae87e0f127a7ebe32b4069c6"
], ],
"index": "pypi", "index": "pypi",
"version": "==2.0.25" "markers": "python_version >= '3.7'",
"version": "==2.0.41"
}, },
"tenacity": { "tenacity": {
"hashes": [ "hashes": [
"sha256:5398ef0d78e63f40007c1fb4c0bff96e1911394d2fa8d194f77619c05ff6cc8a", "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb",
"sha256:ce510e327a630c9e1beaf17d42e6ffacc88185044ad85cf74c0a8887c6a0f88c" "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138"
], ],
"index": "pypi", "index": "pypi",
"version": "==8.2.3" "markers": "python_version >= '3.9'",
"version": "==9.1.2"
}, },
"typing-extensions": { "typing-extensions": {
"hashes": [ "hashes": [
"sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783", "sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4",
"sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd" "sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.9'",
"version": "==4.9.0" "version": "==4.14.0"
} }
}, },
"develop": { "develop": {
"autopep8": { "autopep8": {
"hashes": [ "hashes": [
"sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb", "sha256:8d6c87eba648fdcfc83e29b788910b8643171c395d9c4bcf115ece035b9c9dda",
"sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c" "sha256:a203fe0fcad7939987422140ab17a930f684763bf7335bdb6709991dd7ef6c2d"
], ],
"index": "pypi", "index": "pypi",
"version": "==2.0.4" "markers": "python_version >= '3.8'",
"version": "==2.3.1"
}, },
"flake8": { "flake8": {
"hashes": [ "hashes": [
"sha256:33f96621059e65eec474169085dc92bf26e7b2d47366b70be2f67ab80dc25132", "sha256:049d058491e228e03e67b390f311bbf88fce2dbaa8fa673e7aea87b7198b8d38",
"sha256:a6dfbb75e03252917f2473ea9653f7cd799c3064e54d4c8140044c5c065f53c3" "sha256:597477df7860daa5aa0fdd84bf5208a043ab96b8e96ab708770ae0364dd03213"
], ],
"index": "pypi", "index": "pypi",
"version": "==7.0.0" "markers": "python_full_version >= '3.8.1'",
"version": "==7.1.1"
}, },
"mccabe": { "mccabe": {
"hashes": [ "hashes": [
@ -187,11 +136,11 @@
}, },
"pycodestyle": { "pycodestyle": {
"hashes": [ "hashes": [
"sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f", "sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
"sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67" "sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
], ],
"markers": "python_version >= '3.8'", "markers": "python_version >= '3.8'",
"version": "==2.11.1" "version": "==2.12.1"
}, },
"pyflakes": { "pyflakes": {
"hashes": [ "hashes": [
@ -203,11 +152,11 @@
}, },
"tomli": { "tomli": {
"hashes": [ "hashes": [
"sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc", "sha256:2ebe24485c53d303f690b0ec092806a085f07af5a5aa1464f3931eec36caaa38",
"sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f" "sha256:d46d457a85337051c36524bc5349dd91b1877838e2979ac5ced3e710ed8a60ed"
], ],
"markers": "python_version < '3.11'", "markers": "python_version < '3.11'",
"version": "==2.0.1" "version": "==2.0.2"
} }
} }
} }


@@ -1,4 +1,4 @@
-FROM python:3.9-bullseye
+FROM python:3.9
 ENV TZ="Asia/Tokyo"


@@ -19,11 +19,11 @@
    "develop": {
        "autopep8": {
            "hashes": [
-               "sha256:067959ca4a07b24dbd5345efa8325f5f58da4298dab0dde0443d5ed765de80cb",
-               "sha256:2913064abd97b3419d1cc83ea71f042cb821f87e45b9c88cad5ad3c4ea87fe0c"
+               "sha256:1fa8964e4618929488f4ec36795c7ff12924a68b8bf01366c094fc52f770b6e7",
+               "sha256:2bb76888c5edbcafe6aabab3c47ba534f5a2c2d245c2eddced4a30c4b4946357"
            ],
            "index": "pypi",
-           "version": "==2.0.4"
+           "version": "==2.1.0"
        },
        "flake8": {
            "hashes": [


@ -26,6 +26,7 @@ openpyxl = "*"
xlrd = "*" xlrd = "*"
sqlalchemy = "==2.*" sqlalchemy = "==2.*"
mojimoji = "*" mojimoji = "*"
numpy = "==2.0.*"
[dev-packages] [dev-packages]
autopep8 = "*" autopep8 = "*"

File diff suppressed because it is too large


@@ -84,6 +84,8 @@
 │   ├── exception_handler.py -- error handling when an error occurs inside FastAPI
 │   └── exceptions.py -- custom exception classes
 ├── main.py -- entry point of the AP server; routers and handlers are registered here
+├── middleware -- middleware settings
+│   └── middleware.py
 ├── model -- model layer (the M in MVC)
 │   ├── db -- models for DB records returned by repositories
 │   │   ├── base_db_model.py
@@ -195,3 +197,39 @@
- When a controller's router variable is set to `router.route_class = Authenticate`, it behaves as follows
  - Checks whether a session exists when a request arrives
  - Registers the session key in a cookie on the response
## How to generate and set SRI hash values for scripts loaded in HTML

### What is Subresource Integrity (SRI)?

A security feature with which the browser verifies that a resource fetched from a CDN or other external source has not been tampered with. When using SRI, you specify the hash value that the fetched resource must match.

Details: <https://developer.mozilla.org/ja/docs/Web/Security/Subresource_Integrity>

Because the 実消化&アルトマーク web application loads several external scripts, whenever a loaded script is changed, the script hash value set in the tag's `integrity` attribute must be updated.

### Generating an SRI hash (scripts stored on the server)

- When a script stored on the server is updated, run the following command in a Linux environment (WSL2 is also fine) to generate the hash value

```bash
cat <updated script file name> | openssl dgst -sha384 -binary | openssl base64 -A
```

Reference: <https://developer.mozilla.org/ja/docs/Web/Security/Subresource_Integrity#sri_%E3%83%8F%E3%83%83%E3%82%B7%E3%83%A5%E3%82%92%E7%94%9F%E6%88%90%E3%81%99%E3%82%8B%E3%83%84%E3%83%BC%E3%83%AB>

### Generating an SRI hash (scripts loaded from external sites)

- When a script loaded from an external site is updated, generate the hash value with the MDN online tool below
  - [SRI Hash Generator](https://www.srihash.org/)

### Setting the SRI hash

- Replace the `integrity` attribute value where the updated script is loaded with the generated hash value
- Sample:

```html
<script src="https://link/script.js" integrity="sha384-<generated hash>" crossorigin="anonymous"></script>
```
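The openssl pipeline above can also be expressed as a small Python helper, which may be handy in a build script. A minimal sketch; the file name `example.js` is purely illustrative:

```python
import base64
import hashlib


def sri_hash(path: str) -> str:
    # Read the file, take its SHA-384 digest, and base64-encode it,
    # which is exactly what the openssl pipeline produces.
    with open(path, 'rb') as f:
        digest = hashlib.sha384(f.read()).digest()
    return 'sha384-' + base64.b64encode(digest).decode('ascii')


# Illustrative only: create a dummy script file and hash it.
with open('example.js', 'w', encoding='utf-8') as f:
    f.write('console.log("hello");\n')

value = sri_hash('example.js')
print(value)
```

The returned string can be pasted directly into the `integrity` attribute.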


@@ -79,11 +79,11 @@ def search_bio_data(
        'data': data,
        'count': bio_sales_lot_count
    })
    # rewrite the cookie as well
    json_response.set_cookie(
        key='session',
        value=session.session_key,
        max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # the cookie lifetime is given in seconds, so multiply by 60
        secure=True,
        httponly=True
    )
@@ -153,10 +153,10 @@ async def download_bio_data(
        'status': 'ok',
        'download_url': download_file_url
    })
    json_response.set_cookie(
        key='session',
        value=session.session_key,
        max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # the cookie lifetime is given in seconds, so multiply by 60
        secure=True,
        httponly=True
    )


@@ -70,11 +70,22 @@ def login(
        jwt_token = login_service.login(request.username, request.password)
    except NotAuthorizeException as e:
        logger.info(f'ログイン失敗:{e}')
        # count the login failure
        login_service.increase_login_failed_count(request.username)
        # change the message when the login failure limit has been exceeded
        if login_service.is_login_failed_limit_exceeded(request.username):
            login_service.on_login_fail_limit_exceeded(request.username)
            raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED)
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_ERROR)
    except JWTTokenVerifyException as e:
        logger.info(f'ログイン失敗:{e}')
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)
    # regardless of whether login succeeded, redirect to the logout screen if the login failure count in the DB is 10 or more
    if login_service.is_login_failed_limit_exceeded(request.username):
        logger.info(f'ログイン失敗回数が10回以上: {request.username}')
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=constants.LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED)
    verified_token = jwt_token.verify_token()
    # with normal authentication, it is stored in `cognito:username`
    user_id = verified_token.user_id
@@ -113,6 +124,7 @@ def login(
        status_code=status.HTTP_303_SEE_OTHER,
        headers={'session_key': session_key}
    )
    return response
@@ -170,4 +182,5 @@ def sso_authorize(
        status_code=status.HTTP_303_SEE_OTHER,
        headers={'session_key': session_key}
    )
    return response
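The failure-limit flow added above boils down to: increment a per-user counter on each failed login, and lock the account once the counter reaches the limit. A stripped-down, framework-free sketch; the limit of 10 is an assumption taken from the "10回以上" log message, and the in-memory dict stands in for the DB column `mntuser_login_failed_cnt`:

```python
LOGIN_FAIL_LIMIT = 10  # assumption: the limit suggested by the log message above

# stand-in for the mntuser_login_failed_cnt column in the user master table
failed_counts: dict = {}


def record_login_failure(username: str) -> bool:
    """Increment the failure count and report whether the user is now locked out."""
    failed_counts[username] = failed_counts.get(username, 0) + 1
    return failed_counts[username] >= LOGIN_FAIL_LIMIT


locked = False
for _ in range(LOGIN_FAIL_LIMIT):
    locked = record_login_failure('taro')  # 'taro' is a hypothetical user
```

After `LOGIN_FAIL_LIMIT` failures the flag flips, which corresponds to raising `HTTP_401_UNAUTHORIZED` with `LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED` in the controller.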


@@ -8,6 +8,7 @@ from src.model.internal.session import UserSession
from src.model.view.logout_view_model import LogoutViewModel
from src.system_var import constants
from src.templates import templates
from src.services import session_service

router = APIRouter()
@@ -16,6 +17,7 @@ router = APIRouter()
#########################
@router.get('/', response_class=HTMLResponse)
def logout_view(
    request: Request,
@@ -47,4 +49,9 @@ def logout_view(
    )
    # delete the cookie
    template_response.delete_cookie('session')
    # delete the session
    if session:
        session_service.delete_session(session)
    return template_response


@@ -270,6 +270,7 @@ def inst_emp_csv_download(
    ta_cd=csv_download_form.ta_cd,
    inst_cd=csv_download_form.inst_cd,
    emp_cd=csv_download_form.emp_cd,
    emp_chg_type_cd=csv_download_form.emp_chg_type_cd,
    apply_date_from=csv_download_form.apply_date_from,
    start_date_from=csv_download_form.start_date_from,
    start_date_to=csv_download_form.start_date_to,


@@ -189,7 +189,7 @@ class DatabaseClient:
        self.__session = None

    def to_jst(self):
-       self.execute('SET time_zone = "+9:00"')
+       self.execute('SET SESSION time_zone = "Asia/Tokyo"')

    def __execute_with_transaction(self, query: str, parameters: dict):
        # start a transaction and execute the query


@@ -10,8 +10,9 @@ from src.controller import (bio, bio_api, healthcheck, login, logout,
from src.core import task
from src.error.exception_handler import http_exception_handler
from src.error.exceptions import UnexpectedException
from src.middleware.middleware import SecurityHeadersMiddleware

-app = FastAPI()
+app = FastAPI(openapi_url=None)

# mount the static files
app.mount('/static', StaticFiles(directory=path.dirname(static.__file__)), name='static')
@@ -42,5 +43,8 @@ app.add_exception_handler(status.HTTP_403_FORBIDDEN, http_exception_handler)
# handler for server errors; set separately because HTTPException cannot handle them
app.add_exception_handler(UnexpectedException, http_exception_handler)
# security header settings are handled in middleware
app.add_middleware(SecurityHeadersMiddleware)
# server startup event
app.add_event_handler('startup', task.create_start_app_handler())


@@ -0,0 +1,16 @@
from fastapi import Request, Response, status
from fastapi.responses import JSONResponse
from starlette.middleware.base import BaseHTTPMiddleware


class SecurityHeadersMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request, call_next):
        response = await call_next(request)
        # add the X-Frame-Options header
        response.headers['X-Frame-Options'] = 'DENY'
        # add the X-Content-Type-Options header
        response.headers['X-Content-Type-Options'] = 'nosniff'
        # add the Strict-Transport-Security header (HSTS directives are separated by ';')
        response.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains'
        # add the Cache-Control header
        response.headers['Cache-Control'] = 'private'
        return response
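The middleware is just a wrapper that stamps a fixed set of headers onto every response after the inner handler has run. A framework-free sketch of the same pattern (the dict-based "response" and handler are illustrative stand-ins, not the FastAPI objects used above):

```python
# Header values mirror the middleware above.
SECURITY_HEADERS = {
    'X-Frame-Options': 'DENY',
    'X-Content-Type-Options': 'nosniff',
    'Strict-Transport-Security': 'max-age=31536000; includeSubDomains',
    'Cache-Control': 'private',
}


def with_security_headers(handler):
    # Wrap a handler so every response it produces carries the security headers.
    def wrapped(request):
        response = handler(request)  # response is a dict with a 'headers' key
        response['headers'].update(SECURITY_HEADERS)
        return response
    return wrapped


@with_security_headers
def hello(request):
    return {'body': 'hello', 'headers': {}}


resp = hello({})
```

This is the same decorator-style composition `BaseHTTPMiddleware.dispatch` performs around `call_next`.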


@@ -2,7 +2,7 @@ from datetime import datetime
from typing import Optional

from src.model.db.base_db_model import BaseDBModel
from src.system_var import constants


class UserMasterModel(BaseDBModel):
    user_id: Optional[str]
@@ -25,6 +25,8 @@ class UserMasterModel(BaseDBModel):
    updater: Optional[str]
    update_date: Optional[datetime]
    mntuser_flg: Optional[str]
    mntuser_login_failed_cnt: Optional[int]
    mntuser_last_login_failed_datetime: Optional[datetime]

    def is_enable_user(self):
        return self.enabled_flg == 'Y'
@@ -34,3 +36,6 @@ class UserMasterModel(BaseDBModel):
    def is_groupware_user(self):
        return self.mntuser_flg == '0' or self.mntuser_flg is None

    def is_login_failed_limit_exceeded(self):
        return self.mntuser_login_failed_cnt >= constants.LOGIN_FAIL_LIMIT


@@ -1,16 +1,17 @@
import csv
import json
-from io import TextIOWrapper
-from datetime import datetime
from abc import ABCMeta, abstractmethod
+from datetime import datetime
+from io import TextIOWrapper
+
+from src.logging.get_logger import get_logger
+from src.repositories.bu_master_cd_repository import BuMasterRepository
+from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
+from src.repositories.emp_master_repository import EmpMasterRepository
+from src.repositories.generic_kbn_mst_repository import GenericKbnMstRepository
+from src.repositories.mst_inst_repository import MstInstRepository
from src.system_var import constants
from src.util.string_util import is_not_empty
-from src.repositories.mst_inst_repository import MstInstRepository
-from src.repositories.bu_master_cd_repository import BuMasterRepository
-from src.repositories.emp_master_repository import EmpMasterRepository
-from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
-from src.logging.get_logger import get_logger

logger = get_logger('マスターメンテ')
@@ -24,6 +25,7 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
    emp_master_repository: EmpMasterRepository
    bu_master_repository: BuMasterRepository
    emp_chginst_repository: EmpChgInstRepository
    generic_kbn_mst_repository: GenericKbnMstRepository

    def __init__(
        self,
@@ -33,7 +35,8 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
        mst_inst_repository: MstInstRepository,
        emp_master_repository: EmpMasterRepository,
        bu_master_repository: BuMasterRepository,
-       emp_chginst_repository: EmpChgInstRepository
+       emp_chginst_repository: EmpChgInstRepository,
+       generic_kbn_mst_repository: GenericKbnMstRepository
    ):
        self.csv_row = csv_row
        self.table_name = table_name
@@ -42,6 +45,7 @@ class MasterMainteCSVItem(metaclass=ABCMeta):
        self.emp_master_repository = emp_master_repository
        self.bu_master_repository = bu_master_repository
        self.emp_chginst_repository = emp_chginst_repository
        self.generic_kbn_mst_repository = generic_kbn_mst_repository
    def validate(self) -> list[str]:
        """
@@ -57,6 +61,10 @@
        error_list.extend(self.check_require())
        # institution code existence check
        error_list.extend(self.check_inst_cd_exists())
        # territory code (ta_cd) existence check
        error_list.extend(self.check_ta_cd_exists())
        # person-in-charge type code existence check
        error_list.extend(self.check_emp_chg_type_cd_exists())
        # MUID existence check
        error_list.extend(self.check_emp_cd_exists())
        # BuCd existence check
@@ -79,7 +87,7 @@
        return error_list

    def emp_chg_inst_count(self, start_date: str):
-       return self.emp_chginst_repository.fetch_count(self.inst_cd, self.ta_cd, start_date, self.table_name)
+       return self.emp_chginst_repository.fetch_count(self.inst_cd, self.ta_cd, self.emp_chg_type_cd, start_date, self.table_name)
    def is_exist_emp_cd(self, start_date: str) -> bool:
        if start_date is None or len(start_date) == 0:
@@ -91,12 +99,36 @@
    def is_exist_inst_cd(self) -> bool:
        return True if self.mst_inst_repository.fetch_count(self.inst_cd) > 0 else False

    def is_exist_emp_chg_type_cd(self, start_date: str) -> bool:
        if start_date is None or len(start_date) == 0:
            return False
        if self.generic_kbn_mst_repository.fetch_count('emp_chg_type_cd', self.emp_chg_type_cd, start_date) > 0:
            return True
        return False

    def is_exist_ta_cd(self, start_date: str) -> bool:
        if start_date is None or len(start_date) == 0:
            return False
        if self.generic_kbn_mst_repository.fetch_count('ta_cd', self.ta_cd, start_date) > 0:
            return True
        return False

    def is_exist_bu_cd(self) -> bool:
        return True if self.bu_master_repository.fetch_count(self.bu_cd) > 0 else False

    def make_require_error_message(self, line_num: str, col_name: str) -> str:
        return f'{line_num}行目の{col_name}が入力されておりません。'

    def make_data_exist_error_message(self, line_num: str, primary_key_col_names: list[str]) -> str:
        return self.__make_check_data_exists_error_message(line_num, primary_key_col_names, 'がすべて同一のデータが既に登録されています。')

    def make_data_not_exist_error_message(self, line_num: str, primary_key_col_names: list[str]) -> str:
        return self.__make_check_data_exists_error_message(line_num, primary_key_col_names, 'がすべて同一のデータが存在しないため更新できません。')

    def __make_check_data_exists_error_message(self, line_num: str, primary_key_col_names: list[str], suffix_message: str) -> str:
        primary_key_logical_names = ''.join(primary_key_col_names)
        return f'{line_num}行目の{primary_key_logical_names}{suffix_message}'
    def __parse_str_to_date(self, check_date: str) -> tuple[bool, datetime]:
        try:
            check_date_time: datetime = datetime.strptime(check_date, '%Y%m%d')
@@ -160,6 +192,18 @@
        pass
        ...

    @abstractmethod
    def check_emp_chg_type_cd_exists(self) -> list[str]:
        """Person-in-charge type code existence check"""
        ...

    @abstractmethod
    def check_ta_cd_exists(self) -> list[str]:
        """Territory code (ta_cd) existence check"""
        ...

    @abstractmethod
    def check_emp_cd_exists(self) -> list[str]:
        """MUID existence check"""
@@ -205,7 +249,8 @@ class MasterMainteNewInstEmpCSVItem(MasterMainteCSVItem):
        mst_inst_repository: MstInstRepository,
        emp_master_repository: EmpMasterRepository,
        bu_master_repository: BuMasterRepository,
-       emp_chginst_repository: EmpChgInstRepository
+       emp_chginst_repository: EmpChgInstRepository,
+       generic_kbn_mst_repository: GenericKbnMstRepository
    ):
        super().__init__(
            csv_row,
@@ -214,11 +259,13 @@
            mst_inst_repository,
            emp_master_repository,
            bu_master_repository,
-           emp_chginst_repository
+           emp_chginst_repository,
+           generic_kbn_mst_repository
        )
        self.inst_cd = super().get_csv_value(constants.CSV_NEW_INST_CD_COL_NO)
        self.inst_name = super().get_csv_value(constants.CSV_NEW_INST_NAME_COL_NO)
        self.ta_cd = super().get_csv_value(constants.CSV_NEW_TA_CD_COL_NO)
        self.emp_chg_type_cd = super().get_csv_value(constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO)
        self.emp_cd = super().get_csv_value(constants.CSV_NEW_EMP_CD_COL_NO)
        self.emp_name_family = super().get_csv_value(constants.CSV_NEW_EMP_NAME_FAMILY_COL_NO)
        self.emp_name_first = super().get_csv_value(constants.CSV_NEW_EMP_NAME_FIRST_COL_NO)
@@ -237,6 +284,9 @@
        if len(self.ta_cd) == 0:
            error_list.append(self.make_require_error_message(
                self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO]))
        if len(self.emp_chg_type_cd) == 0:
            error_list.append(self.make_require_error_message(
                self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO]))
        if len(self.emp_cd) == 0:
            error_list.append(self.make_require_error_message(
                self.line_num, constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CD_COL_NO]))
@@ -271,6 +321,26 @@
は従業員マスタに存在しない もしくは 適用期間外のIDです')
        return error_list
    def check_emp_chg_type_cd_exists(self) -> list[str]:
        error_list = []
        if not self.start_date or not self.emp_chg_type_cd:
            return error_list
        if is_not_empty(self.emp_chg_type_cd) and super().is_exist_emp_chg_type_cd(self.start_date) is False:
            error_list.append(f'{self.line_num}行目の{constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO]}\
は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
        return error_list

    def check_ta_cd_exists(self) -> list[str]:
        error_list = []
        if not self.start_date or not self.ta_cd:
            return error_list
        if is_not_empty(self.ta_cd) and super().is_exist_ta_cd(self.start_date) is False:
            error_list.append(f'{self.line_num}行目の{constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO]}\
は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
        return error_list
    def check_bu_cd_exists(self) -> list[str]:
        error_list = []
@@ -303,7 +373,15 @@
    def check_data_exists(self) -> list[str]:
        error_list = []
        if super().emp_chg_inst_count(self.start_date) > 0:
-           error_list.append(f'{self.line_num}行目の施設コード、領域コード、適用開始日がすべて同一のデータが既に登録されています。')
+           error_list.append(super().make_data_exist_error_message(
+               self.line_num,
+               primary_key_col_names=[
+                   constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_INST_CD_COL_NO],
+                   constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_TA_CD_COL_NO],
+                   constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_EMP_CHG_TYPE_CD_COL_NO],
+                   constants.NEW_INST_EMP_CSV_LOGICAL_NAMES[constants.CSV_NEW_START_DATE]
+               ]
+           ))
        return error_list
@@ -329,7 +407,8 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
                 mst_inst_repository: MstInstRepository,
                 emp_master_repository: EmpMasterRepository,
                 bu_master_repository: BuMasterRepository,
-                emp_chginst_repository: EmpChgInstRepository
+                emp_chginst_repository: EmpChgInstRepository,
+                generic_kbn_mst_repository: GenericKbnMstRepository
                 ):
         super().__init__(
             csv_row,
@@ -338,7 +417,8 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
             mst_inst_repository,
             emp_master_repository,
             bu_master_repository,
-            emp_chginst_repository
+            emp_chginst_repository,
+            generic_kbn_mst_repository
         )
         self.bu_cd = super().get_csv_value(constants.CSV_CHANGE_BU_CD_COL_NO)
         self.bu_name = super().get_csv_value(constants.CSV_CHANGE_BU_NAME_COL_NO)
@@ -348,6 +428,7 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         self.inst_name = super().get_csv_value(constants.CSV_CHANGE_INST_NAME_COL_NO)
         self.ta_cd = super().get_csv_value(constants.CSV_CHANGE_TA_CD_COL_NO)
         self.explain = super().get_csv_value(constants.CSV_CHANGE_EXPLAIN_COL_NO)
+        self.emp_chg_type_cd = super().get_csv_value(constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO)
         self.emp_cd = super().get_csv_value(constants.CSV_CHANGE_EMP_CD_COL_NO)
         self.emp_full_name = super().get_csv_value(constants.CSV_CHANGE_EMP_FULL_NAME_COL_NO)
         self.inst_emp_start_date = super().get_csv_value(constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO)
@@ -370,6 +451,9 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
+        if len(self.emp_chg_type_cd) == 0:
+            error_list.append(self.make_require_error_message(
+                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.emp_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CD_COL_NO]))
@@ -388,6 +472,9 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
+        if len(self.emp_chg_type_cd) == 0:
+            error_list.append(self.make_require_error_message(
+                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.inst_emp_start_date) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num,
@@ -403,6 +490,10 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         if len(self.ta_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]))
+        if len(self.emp_chg_type_cd) == 0:
+            error_list.append(self.make_require_error_message(
+                self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]))
         if len(self.emp_cd) == 0:
             error_list.append(self.make_require_error_message(
                 self.line_num, constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CD_COL_NO]))
@@ -435,6 +526,28 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
 は従業員マスタに存在しない もしくは 適用期間外のIDです')
         return error_list

+    def check_emp_chg_type_cd_exists(self) -> list[str]:
+        error_list = []
+        if not self.inst_emp_start_date or not self.emp_chg_type_cd:
+            return error_list
+        if is_not_empty(self.emp_chg_type_cd) and super().is_exist_emp_chg_type_cd(self.inst_emp_start_date) is False:
+            error_list.append(f'{self.line_num}行目の{constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO]}\
+は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
+        return error_list
+
+    def check_ta_cd_exists(self) -> list[str]:
+        error_list = []
+        if not self.inst_emp_start_date or not self.ta_cd:
+            return error_list
+        if is_not_empty(self.ta_cd) and super().is_exist_ta_cd(self.inst_emp_start_date) is False:
+            error_list.append(f'{self.line_num}行目の{constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO]}\
+は汎用区分マスタに存在しない もしくは 適用期間外のコードです')
+        return error_list
+
     def check_bu_cd_exists(self) -> list[str]:
         error_list = []
@@ -484,10 +597,26 @@ class MasterMainteChangeInstEmpCSVItem(MasterMainteCSVItem):
         error_list = []
         emp_chg_inst_count = super().emp_chg_inst_count(self.inst_emp_start_date)
         if self.comment == '追加' and emp_chg_inst_count > 0:
-            error_list.append(f'{self.line_num}行目の施設コード、領域コード、施設担当_開始日がすべて同一のデータが既に登録されています。')
+            error_list.append(super().make_data_exist_error_message(
+                self.line_num,
+                primary_key_col_names=[
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO]
+                ]
+            ))
         elif (self.comment == '終了' or self.comment == '担当者修正') and emp_chg_inst_count == 0:
-            error_list.append(f'{self.line_num}行目の施設コード、領域コード、施設担当_開始日がすべて同一のデータが存在しないため更新できません。')
+            error_list.append(super().make_data_not_exist_error_message(
+                self.line_num,
+                primary_key_col_names=[
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_TA_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO],
+                    constants.CHANGE_INST_CSV_LOGICAL_NAMES[constants.CSV_CHANGE_INST_EMP_START_DATE_COL_NO]
+                ]
+            ))
         return error_list
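The hunks above replace the hard-coded duplicate/missing-data messages with `make_data_exist_error_message` / `make_data_not_exist_error_message`, built from the logical primary-key column names. The helpers themselves are outside this diff, so the following is only a hypothetical sketch of what they might do, assuming they join the logical names with `、` to reproduce the wording of the old literals:

```python
# Hypothetical sketch of the message helpers referenced in the hunks above.
# The real implementations are not part of this diff; names and wording are assumed.
def make_data_exist_error_message(line_num: int, primary_key_col_names: list[str]) -> str:
    # Join the logical column names the same way the old hard-coded message did
    cols = '、'.join(primary_key_col_names)
    return f'{line_num}行目の{cols}がすべて同一のデータが既に登録されています。'


def make_data_not_exist_error_message(line_num: int, primary_key_col_names: list[str]) -> str:
    cols = '、'.join(primary_key_col_names)
    return f'{line_num}行目の{cols}がすべて同一のデータが存在しないため更新できません。'


print(make_data_exist_error_message(3, ['施設コード', '領域コード', '適用開始日']))
```

Passing the column names as a parameter keeps one message format in a single place, which is presumably why the diff adds the new `担当者種別コード` key to the lists rather than editing four string literals.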
@@ -525,7 +654,8 @@ class MasterMainteCSVItems:
                 mst_inst_repository: MstInstRepository,
                 emp_master_repository: EmpMasterRepository,
                 bu_master_repository: BuMasterRepository,
-                emp_chginst_repository: EmpChgInstRepository
+                emp_chginst_repository: EmpChgInstRepository,
+                generic_kbn_mst_repository: GenericKbnMstRepository
                 ) -> None:
         reader = csv.reader(file)
         csv_rows = []
@@ -540,7 +670,9 @@ class MasterMainteCSVItems:
                     mst_inst_repository,
                     emp_master_repository,
                     bu_master_repository,
-                    emp_chginst_repository))
+                    emp_chginst_repository,
+                    generic_kbn_mst_repository
+                    ))
         self.lines = csv_rows

     def __select_function(self,
@@ -551,7 +683,8 @@ class MasterMainteCSVItems:
                           mst_inst_repository: MstInstRepository,
                           emp_master_repository: EmpMasterRepository,
                           bu_master_repository: BuMasterRepository,
-                          emp_chginst_repository: EmpChgInstRepository) -> MasterMainteCSVItem:
+                          emp_chginst_repository: EmpChgInstRepository,
+                          generic_kbn_mst_repository: GenericKbnMstRepository) -> MasterMainteCSVItem:
         if function_type == 'new':
             return MasterMainteNewInstEmpCSVItem(
                 row,
@@ -560,7 +693,8 @@ class MasterMainteCSVItems:
                 mst_inst_repository,
                 emp_master_repository,
                 bu_master_repository,
-                emp_chginst_repository)
+                emp_chginst_repository,
+                generic_kbn_mst_repository)
         elif function_type == 'change':
             return MasterMainteChangeInstEmpCSVItem(
                 row,
@@ -569,4 +703,5 @@ class MasterMainteCSVItems:
                 mst_inst_repository,
                 emp_master_repository,
                 bu_master_repository,
-                emp_chginst_repository)
+                emp_chginst_repository,
+                generic_kbn_mst_repository)

View File

@@ -29,8 +29,8 @@ class MasterMainteEmpChgInstFunction(metaclass=ABCMeta):
     def save(self):
         error_list = []
         try:
-            self.emp_chginst_repository.to_jst()
             self.emp_chginst_repository.begin()
+            self.emp_chginst_repository.to_jst()
             (result_message, error_list) = self.write_emp_chg_inst_table()
             if len(error_list) > 0:
                 self.emp_chginst_repository.rollback()
@@ -46,6 +46,7 @@ class MasterMainteEmpChgInstFunction(metaclass=ABCMeta):
         self.emp_chginst_repository.insert_emp_chg_inst(
             data['施設コード'],
             data['領域コード'],
+            data['担当者種別コード'],
             data['MUID'],
             data['ビジネスユニットコード'],
             start_date,
@@ -148,6 +149,7 @@ class ChangeEmpChgInstFunction(MasterMainteEmpChgInstFunction):
         self.emp_chginst_repository.end_emp_chg_inst(
             data['施設コード'],
             data['領域コード'],
+            data['担当者種別コード'],
             data['施設担当_開始日'],
             data['終了日の変更'],
             self.user_name,
@@ -158,6 +160,7 @@ class ChangeEmpChgInstFunction(MasterMainteEmpChgInstFunction):
             data['施設コード'],
             data['領域コード'],
             data['施設担当_開始日'],
+            data['担当者種別コード'],
             data['MUID'],
             self.user_name,
             self.table_name)

View File

@@ -2,20 +2,22 @@ from typing import Optional
 from fastapi import Form
-from src.util.sanitize import sanitize
 from src.model.request.request_base_model import RequestBaseModel
+from src.util.sanitize import sanitize
 from src.util.string_util import is_not_empty


 @sanitize
 class MasterMainteCsvDlModel(RequestBaseModel):
+    # The adapt_* fields hold the values actually used for the search
     ta_cd: Optional[str]
     adapt_ta_cd: Optional[str]
     inst_cd: Optional[str]
     adapt_inst_cd: Optional[str]
     emp_cd: Optional[str]
     adapt_emp_cd: Optional[str]
+    emp_chg_type_cd: Optional[str]
+    adapt_emp_chg_type_cd: Optional[str]
     apply_date_from: Optional[str]
     adapt_apply_date_from: Optional[str]
     start_date_from: Optional[str]
@@ -42,6 +44,7 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             ctrl_ta_cd: Optional[str] = Form(None),
             ctrl_inst_cd: Optional[str] = Form(None),
             ctrl_emp_cd: Optional[str] = Form(None),
+            ctrl_emp_chg_type_cd: Optional[str] = Form(None),
             ctrl_apply_date_from: Optional[str] = Form(None),
             ctrl_start_date_from: Optional[str] = Form(None),
             ctrl_start_date_to: Optional[str] = Form(None),
@@ -58,6 +61,7 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             ctrl_ta_cd,
             ctrl_inst_cd,
             ctrl_emp_cd,
+            ctrl_emp_chg_type_cd,
             ctrl_apply_date_from,
             ctrl_start_date_from,
             ctrl_start_date_to,
@@ -75,6 +79,7 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             ctrl_ta_cd: str,
             ctrl_inst_cd: str,
             ctrl_emp_cd: str,
+            ctrl_emp_chg_type_cd,
             ctrl_apply_date_from: str,
             ctrl_start_date_from: str,
             ctrl_start_date_to: str,
@@ -89,6 +94,7 @@ class MasterMainteCsvDlModel(RequestBaseModel):
         ctrl_ta_cd = ctrl_ta_cd if is_not_empty(ctrl_ta_cd) else ''
         ctrl_inst_cd = ctrl_inst_cd if is_not_empty(ctrl_inst_cd) else ''
         ctrl_emp_cd = ctrl_emp_cd if is_not_empty(ctrl_emp_cd) else ''
+        ctrl_emp_chg_type_cd = ctrl_emp_chg_type_cd if is_not_empty(ctrl_emp_chg_type_cd) else ''

         adapt_apply_date_from = ''
         if is_not_empty(ctrl_apply_date_from):
@@ -147,6 +153,8 @@ class MasterMainteCsvDlModel(RequestBaseModel):
             adapt_inst_cd=ctrl_inst_cd,
             emp_cd=ctrl_emp_cd,
             adapt_emp_cd=ctrl_emp_cd,
+            emp_chg_type_cd=ctrl_emp_chg_type_cd,
+            adapt_emp_chg_type_cd=ctrl_emp_chg_type_cd,
             apply_date_from=ctrl_apply_date_from,
             adapt_apply_date_from=adapt_apply_date_from,
             start_date_from=ctrl_start_date_from,

View File

@@ -9,6 +9,7 @@ class InstEmpCsvDownloadViewModel(BaseModel):
     ta_cd: str = ''
     inst_cd: str = ''
     emp_cd: str = ''
+    emp_chg_type_cd: str = ''
     apply_date_from: str = ''
     start_date_from: str = ''
     start_date_to: str = ''

View File

@@ -1,10 +1,10 @@
-from src.repositories.base_repository import BaseRepository
-from src.db.sql_condition import SQLCondition
 from src.db import sql_condition as condition
+from src.db.sql_condition import SQLCondition
+from src.logging.get_logger import get_logger
 from src.model.db.master_mente_count import MasterMenteCountModel
 from src.model.request.master_mainte_csvdl import MasterMainteCsvDlModel
+from src.repositories.base_repository import BaseRepository
 from src.util.string_util import is_not_empty
-from src.logging.get_logger import get_logger

 logger = get_logger('従業員担当施設マスタ')
@@ -28,6 +28,7 @@ class EmpChgInstRepository(BaseRepository):
         (
             inst_cd,
             ta_cd,
+            emp_chg_type_cd,
             emp_cd,
             bu_cd,
             start_date,
@@ -42,6 +43,7 @@ class EmpChgInstRepository(BaseRepository):
         VALUES (
             :inst_cd,
             :ta_cd,
+            :emp_chg_type_cd,
             :emp_cd,
             :bu_cd,
             :start_date,
@@ -55,13 +57,14 @@ class EmpChgInstRepository(BaseRepository):
         )
     """

-    def insert_emp_chg_inst(self, inst_cd, ta_cd, emp_cd, bu_cd, start_date,
+    def insert_emp_chg_inst(self, inst_cd, ta_cd, emp_chg_type_cd, emp_cd, bu_cd, start_date,
                             end_date, create_user_name, table_name):
         try:
             query = self.INSERT_SQL.format(table_name=table_name)
             self._database.execute(query, {
                 'inst_cd': inst_cd,
                 'ta_cd': ta_cd,
+                'emp_chg_type_cd': emp_chg_type_cd,
                 'emp_cd': emp_cd,
                 'bu_cd': bu_cd,
                 'start_date': start_date,
@@ -82,17 +85,19 @@ class EmpChgInstRepository(BaseRepository):
             update_date = NOW()
         WHERE
             inst_cd = :inst_cd
-            and ta_cd = :ta_cd
-            and start_date = :start_date
+            AND ta_cd = :ta_cd
+            AND emp_chg_type_cd = :emp_chg_type_cd
+            AND start_date = :start_date
     """

-    def end_emp_chg_inst(self, inst_cd, ta_cd, start_date,
+    def end_emp_chg_inst(self, inst_cd, ta_cd, emp_chg_type_cd, start_date,
                          end_date, update_user_name, table_name):
         try:
             query = self.UPDATE_END_DATE_SQL.format(table_name=table_name)
             self._database.execute(query, {
                 'inst_cd': inst_cd,
                 'ta_cd': ta_cd,
+                'emp_chg_type_cd': emp_chg_type_cd,
                 'start_date': start_date,
                 'end_date': end_date,
                 'update_user_name': update_user_name
@@ -110,16 +115,18 @@ class EmpChgInstRepository(BaseRepository):
             update_date = NOW()
         where
             inst_cd = :inst_cd
-            and ta_cd = :ta_cd
-            and start_date = :start_date
+            AND ta_cd = :ta_cd
+            AND emp_chg_type_cd = :emp_chg_type_cd
+            AND start_date = :start_date
     """

-    def modify_emp_chg_inst(self, inst_cd, ta_cd, start_date, emp_cd, update_user_name, table_name):
+    def modify_emp_chg_inst(self, inst_cd, ta_cd, start_date, emp_chg_type_cd, emp_cd, update_user_name, table_name):
         try:
             query = self.UPDATE_EMP_CD_SQL.format(table_name=table_name)
             self._database.execute(query, {
                 'inst_cd': inst_cd,
                 'ta_cd': ta_cd,
+                'emp_chg_type_cd': emp_chg_type_cd,
                 'start_date': start_date,
                 'emp_cd': emp_cd,
                 'update_user_name': update_user_name
@@ -136,14 +143,15 @@ class EmpChgInstRepository(BaseRepository):
         WHERE
             inst_cd = :inst_cd
             AND ta_cd = :ta_cd
+            AND emp_chg_type_cd = :emp_chg_type_cd
             AND start_date = :start_date
     """

-    def fetch_count(self, inst_cd, ta_cd, start_date, table_name) -> MasterMenteCountModel:
+    def fetch_count(self, inst_cd, ta_cd, emp_chg_type_cd, start_date, table_name) -> MasterMenteCountModel:
         try:
             query = self.FETCH_COUNT_SQL.format(table_name=table_name)
             result = self._database.execute_select(query, {'inst_cd': inst_cd, 'ta_cd': ta_cd,
-                                                           'start_date': start_date})
+                                                           'emp_chg_type_cd': emp_chg_type_cd, 'start_date': start_date})
             models = [MasterMenteCountModel(**r) for r in result]
             if len(models) == 0:
                 return 0
@@ -157,6 +165,7 @@ class EmpChgInstRepository(BaseRepository):
             eci.inst_cd AS inst_cd,
             mi.inst_name AS inst_name,
             eci.ta_cd AS ta_cd,
+            eci.emp_chg_type_cd AS emp_chg_type_cd,
             eci.emp_cd AS emp_cd,
             CONCAT(emp.emp_name_family, " ", emp.emp_name_first) AS emp_name_full,
             eci.bu_cd AS bu_cd,
@@ -212,6 +221,11 @@ class EmpChgInstRepository(BaseRepository):
             parameter.adapt_emp_cd = f'%{parameter.emp_cd}%'
             where_clauses.append(SQLCondition('eci.emp_cd', condition.LIKE, 'adapt_emp_cd'))

+        # When a staff type code was entered
+        if is_not_empty(parameter.emp_chg_type_cd):
+            parameter.adapt_emp_chg_type_cd = f'%{parameter.emp_chg_type_cd}%'
+            where_clauses.append(SQLCondition('eci.emp_chg_type_cd', condition.LIKE, 'adapt_emp_chg_type_cd'))
+
         # When an apply-period date was entered
         if is_not_empty(parameter.adapt_apply_date_from):
             where_clauses.append(SQLCondition('eci.start_date', condition.LE, 'adapt_apply_date_from'))

View File

@@ -0,0 +1,33 @@
+from src.repositories.base_repository import BaseRepository
+from src.model.db.master_mente_count import MasterMenteCountModel
+from src.logging.get_logger import get_logger
+
+logger = get_logger('汎用区分マスタ')
+
+
+class GenericKbnMstRepository(BaseRepository):
+    FETCH_SQL = """\
+        SELECT
+            COUNT(*) AS count
+        FROM
+            src05.generic_kbn_mst
+        WHERE
+            generic_kbn_mst.generic_kbn_cd = :generic_kbn_cd
+        AND
+            generic_kbn_mst.kbn_cd = :kbn_cd
+        AND
+            STR_TO_DATE( :start_date , '%Y%m%d') BETWEEN generic_kbn_mst.start_date AND generic_kbn_mst.end_date\
+    """
+
+    def fetch_count(self, generic_kbn_cd: str, kbn_cd: str, start_date: str) -> MasterMenteCountModel:
+        try:
+            query = self.FETCH_SQL
+            result = self._database.execute_select(query, {'generic_kbn_cd': generic_kbn_cd, 'kbn_cd': kbn_cd, 'start_date': start_date})
+            models = [MasterMenteCountModel(**r) for r in result]
+            if len(models) == 0:
+                return 0
+            return models[0].count
+        except Exception as e:
+            logger.error(f"DB Error : Exception={e.args}")
+            raise e
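The new repository counts `generic_kbn_mst` rows whose code matches and whose validity period contains the given `YYYYMMDD` date. The same period check can be restated in plain Python; the sketch below mirrors the SQL `STR_TO_DATE(...) BETWEEN start_date AND end_date` logic against in-memory rows (the row dicts are stand-ins for database records, not part of this change):

```python
from datetime import date


# Pure-Python mirror of GenericKbnMstRepository.FETCH_SQL, for illustration:
# `rows` stands in for src05.generic_kbn_mst records.
def count_valid_kbn(rows: list[dict], generic_kbn_cd: str, kbn_cd: str, start_date: str) -> int:
    # Equivalent of STR_TO_DATE(:start_date, '%Y%m%d')
    target = date(int(start_date[:4]), int(start_date[4:6]), int(start_date[6:8]))
    return sum(
        1 for r in rows
        if r['generic_kbn_cd'] == generic_kbn_cd
        and r['kbn_cd'] == kbn_cd
        # BETWEEN start_date AND end_date is inclusive on both ends
        and r['start_date'] <= target <= r['end_date']
    )


rows = [{'generic_kbn_cd': '01', 'kbn_cd': 'A',
         'start_date': date(2025, 1, 1), 'end_date': date(2025, 12, 31)}]
print(count_valid_kbn(rows, '01', 'A', '20250601'))  # in period -> 1
```

A count of 0 is what drives the "存在しない もしくは 適用期間外" validation messages added in the CSV item classes.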

View File

@@ -6,6 +6,19 @@ logger = get_logger('ユーザー取得')
 class UserMasterRepository(BaseRepository):

+    def to_jst(self):
+        self._database.to_jst()
+
+    def begin(self):
+        self._database.begin()
+
+    def commit(self):
+        self._database.commit()
+
+    def rollback(self):
+        self._database.rollback()
+
     FETCH_SQL = """\
         SELECT
             *
@@ -26,3 +39,46 @@ class UserMasterRepository(BaseRepository):
         except Exception as e:
             logger.exception(f"DB Error : Exception={e}")
             raise e
+
+    def increase_login_failed_count(self, parameter: dict) -> UserMasterModel:
+        try:
+            query = """\
+                UPDATE
+                    src05.user_mst
+                SET
+                    mntuser_login_failed_cnt =
+                        CASE
+                            WHEN
+                                DATE(mntuser_last_login_failed_datetime) = DATE(CURRENT_TIMESTAMP())
+                            THEN
+                                mntuser_login_failed_cnt + 1
+                            ELSE
+                                1
+                        END,
+                    mntuser_last_login_failed_datetime = CURRENT_TIMESTAMP()
+                WHERE
+                    user_id = :user_id
+                AND
+                    mntuser_flg = 1;\
+            """
+            self._database.execute(query, parameter)
+        except Exception as e:
+            logger.exception(f"DB Error : Exception={e}")
+            raise e
+
+    def disable_mnt_user(self, parameter: dict) -> UserMasterModel:
+        try:
+            query = """\
+                UPDATE
+                    src05.user_mst
+                SET
+                    enabled_flg = 'N'
+                WHERE
+                    user_id = :user_id
+                AND
+                    mntuser_flg = 1\
+            """
+            self._database.execute(query, parameter)
+        except Exception as e:
+            logger.exception(f"DB Error : Exception={e}")
+            raise e
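The `CASE` expression in `increase_login_failed_count` resets the counter to 1 when the previous failure happened on a different day, and increments it otherwise. The same rule in plain Python (illustration only; the function name is mine, not from the diff):

```python
from datetime import datetime
from typing import Optional


# Restates the CASE expression from increase_login_failed_count: a failure on the
# same calendar day as the previous one increments the counter, otherwise it resets to 1.
def next_failed_count(current_cnt: int, last_failed: Optional[datetime], now: datetime) -> int:
    if last_failed is not None and last_failed.date() == now.date():
        return current_cnt + 1
    return 1


now = datetime(2025, 6, 16, 11, 0)
print(next_failed_count(2, datetime(2025, 6, 16, 9, 30), now))   # same day -> 3
print(next_failed_count(2, datetime(2025, 6, 15, 23, 59), now))  # new day -> 1
```

Comparing `DATE(...)` values rather than a rolling 24-hour window means the counter survives only within one calendar day, which is presumably the intended lockout granularity.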

View File

@@ -103,6 +103,7 @@ class AfterSetCookieSessionRoute(MeDaCaRoute):
     """Custom route handler that sets the session key in a cookie as a post-processing step"""
     async def post_process_route(self, request: Request, response: Response):
         response = await super().post_process_route(request, response)
+
         session_key = response.headers.get('session_key', None)
         # If there is no session key, return without setting one
         if session_key is None:
@@ -123,7 +124,6 @@ class AfterSetCookieSessionRoute(MeDaCaRoute):
         response.set_cookie(
             key='session',
             value=session_key,
-            max_age=environment.SESSION_EXPIRE_MINUTE * 60,  # the cookie lifetime is given in seconds, so multiply by 60
             secure=True,
             httponly=True
         )
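Dropping `max_age` changes the cookie's lifetime semantics: per RFC 6265, a cookie with no `Max-Age` or `Expires` attribute is a session cookie that the browser discards when its session ends, so expiry is now governed only by the server-side session store. A minimal sketch of the resulting header (the helper is mine, for illustration):

```python
# Illustrative only: the Set-Cookie header produced once max_age is removed.
# With no Max-Age/Expires attribute, browsers treat this as a session cookie.
def session_cookie_header(session_key: str) -> str:
    return f'session={session_key}; Secure; HttpOnly'


hdr = session_cookie_header('abc123')
print(hdr)                    # session=abc123; Secure; HttpOnly
print('Max-Age' not in hdr)   # True: no explicit lifetime
```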

View File

@@ -49,6 +49,27 @@ class LoginService(BaseService):
         user_record: UserMasterModel = self.user_repository.fetch_one({'user_id': user_id})
         return user_record

+    def increase_login_failed_count(self, user_id: str):
+        try:
+            # Explicitly begin a transaction so the time-zone change applies within the session
+            self.user_repository.begin()
+            self.user_repository.to_jst()
+            self.user_repository.increase_login_failed_count({'user_id': user_id})
+            self.user_repository.commit()
+        except Exception as e:
+            self.user_repository.rollback()
+            raise e
+
+    def on_login_fail_limit_exceeded(self, user_id: str):
+        self.user_repository.disable_mnt_user({'user_id': user_id})
+
+    def is_login_failed_limit_exceeded(self, user_id: str):
+        user_record: UserMasterModel = self.user_repository.fetch_one({'user_id': user_id})
+        if user_record is None:
+            return False
+        return user_record.is_login_failed_limit_exceeded()
+
     def __secret_hash(self, username: str):
         # see - https://aws.amazon.com/jp/premiumsupport/knowledge-center/cognito-unable-to-verify-secret-hash/  # noqa
         message = bytes(username + environment.COGNITO_CLIENT_ID, 'utf-8')
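The three new `LoginService` methods compose into a lockout flow: record the failure, check the limit, disable the account. A minimal in-memory sketch of that orchestration (the `FakeUserRepo` stand-in and the limit of 5 are assumptions; the real limit and caller live outside this diff):

```python
# Illustrative lockout flow built from the three hooks added above.
# FAIL_LIMIT = 5 is an assumed value, not taken from this change.
FAIL_LIMIT = 5


class FakeUserRepo:
    """In-memory stand-in for UserMasterRepository."""
    def __init__(self):
        self.failed = {}
        self.disabled = set()

    def increase_login_failed_count(self, user_id):
        self.failed[user_id] = self.failed.get(user_id, 0) + 1

    def is_limit_exceeded(self, user_id):
        return self.failed.get(user_id, 0) >= FAIL_LIMIT

    def disable_mnt_user(self, user_id):
        self.disabled.add(user_id)


def on_login_failed(repo, user_id):
    # Mirrors increase_login_failed_count -> is_login_failed_limit_exceeded
    # -> on_login_fail_limit_exceeded in LoginService.
    repo.increase_login_failed_count(user_id)
    if repo.is_limit_exceeded(user_id):
        repo.disable_mnt_user(user_id)


repo = FakeUserRepo()
for _ in range(FAIL_LIMIT):
    on_login_failed(repo, 'u1')
print('u1' in repo.disabled)  # True after reaching the limit
```

This matches the new `LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED` message added to the constants: once locked, the account stays disabled until an administrator intervenes.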

View File

@@ -19,6 +19,7 @@ from src.repositories.mst_inst_repository import MstInstRepository
 from src.repositories.bu_master_cd_repository import BuMasterRepository
 from src.repositories.emp_master_repository import EmpMasterRepository
 from src.repositories.emp_chg_inst_repository import EmpChgInstRepository
+from src.repositories.generic_kbn_mst_repository import GenericKbnMstRepository
 from src.model.internal.master_mainte_csv import MasterMainteCSVItems
 from src.model.internal.master_mainte_emp_chg_inst_function import NewEmpChgInstFunction
 from src.model.internal.master_mainte_emp_chg_inst_function import ChangeEmpChgInstFunction
@@ -38,6 +39,7 @@ class MasterMainteService(BaseService):
         'emp_master_repository': EmpMasterRepository,
         'bu_master_repository': BuMasterRepository,
         'emp_chginst_repository': EmpChgInstRepository,
+        'generic_kbn_mst_repository': GenericKbnMstRepository,
     }

     CLIENTS = {
@@ -48,6 +50,7 @@ class MasterMainteService(BaseService):
     emp_master_repository: EmpMasterRepository
     bu_master_repository: BuMasterRepository
     emp_chginst_repository: EmpChgInstRepository
+    generic_kbn_mst_repository: GenericKbnMstRepository
     s3_client: S3Client

     def __init__(self, repositories: dict[str, BaseRepository], clients: dict[str, AWSAPIClient]) -> None:
@@ -56,6 +59,7 @@ class MasterMainteService(BaseService):
         self.emp_master_repository = repositories['emp_master_repository']
         self.bu_master_repository = repositories['bu_master_repository']
         self.emp_chginst_repository = repositories['emp_chginst_repository']
+        self.generic_kbn_mst_repository = repositories['generic_kbn_mst_repository']
         self.s3_client = clients['s3_client']

     def prepare_mainte_csv_up_view(self,
@@ -77,7 +81,8 @@ class MasterMainteService(BaseService):
             self.mst_inst_repository,
             self.emp_master_repository,
             self.bu_master_repository,
-            self.emp_chginst_repository
+            self.emp_chginst_repository,
+            self.generic_kbn_mst_repository
         )

         error_message_list = []
@@ -148,8 +153,8 @@ class MasterMainteService(BaseService):
     def copy_data_real_to_dummy(self) -> TableOverrideViewModel:
         try:
-            self.emp_chginst_repository.to_jst()
             self.emp_chginst_repository.begin()
+            self.emp_chginst_repository.to_jst()
             self.emp_chginst_repository.delete_dummy_table()
             self.emp_chginst_repository.copy_real_to_dummy()
             self.emp_chginst_repository.commit()

View File

@@ -17,3 +17,10 @@ def get_session(key: str) -> UserSession:
     except UserSession.DoesNotExist as e:
         logger.debug(f'セッション取得失敗:{e}')
         return None
+
+
+def delete_session(session: UserSession):
+    try:
+        session.delete()
+    except Exception:
+        # Deliberately ignore failures: deleting an already-removed session is a no-op
+        pass

View File

@@ -63,6 +63,7 @@ LOGOUT_REASON_BACKUP_PROCESSING = 'dump_processing'
 LOGOUT_REASON_NOT_LOGIN = 'not_login'
 LOGOUT_REASON_DB_ERROR = 'db_error'
 LOGOUT_REASON_UNEXPECTED = 'unexpected'
+LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED = 'login_failed_limit_exceeded'

 LOGOUT_REASON_MESSAGE_MAP = {
     LOGOUT_REASON_DO_LOGOUT: 'Logoutしました。',
@@ -72,7 +73,8 @@ LOGOUT_REASON_MESSAGE_MAP = {
     LOGOUT_REASON_BACKUP_PROCESSING: 'バックアップ取得を開始しました。<br>日次バッチ更新が終了するまでマスターメンテは使用できません',
     LOGOUT_REASON_NOT_LOGIN: 'Loginしてからページにアクセスしてください。',
     LOGOUT_REASON_DB_ERROR: 'DB接続に失敗しました。<br>再度Loginするか、<br>管理者にお問い合わせください。',
-    LOGOUT_REASON_UNEXPECTED: '予期しないエラーが発生しました。<br>再度Loginするか、<br>管理者に問い合わせてください。'
+    LOGOUT_REASON_UNEXPECTED: '予期しないエラーが発生しました。<br>再度Loginするか、<br>管理者に問い合わせてください。',
+    LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED: 'ログイン失敗回数の上限を超えましたので<br>アカウントをロックしました。<br>管理者に連絡してください'
 }

 # New facility staff registration CSV (master maintenance)
@@ -80,6 +82,7 @@ NEW_INST_EMP_CSV_LOGICAL_NAMES = [
     '施設コード',
     '施設名',
     '領域コード',
+    '担当者種別コード',
     'MUID',
     '担当者名(姓)',
     '担当者名(名)',
@@ -93,18 +96,20 @@ CSV_NEW_INST_CD_COL_NO = 0
 # Facility name column No.
 CSV_NEW_INST_NAME_COL_NO = 1
 # Area code column No.
 CSV_NEW_TA_CD_COL_NO = 2
+# Staff type code column No.
+CSV_NEW_EMP_CHG_TYPE_CD_COL_NO = 3
 # MUID column No.
-CSV_NEW_EMP_CD_COL_NO = 3
+CSV_NEW_EMP_CD_COL_NO = 4
 # Staff family name column No.
-CSV_NEW_EMP_NAME_FAMILY_COL_NO = 4
+CSV_NEW_EMP_NAME_FAMILY_COL_NO = 5
 # Staff first name column No.
-CSV_NEW_EMP_NAME_FIRST_COL_NO = 5
+CSV_NEW_EMP_NAME_FIRST_COL_NO = 6
 # Business unit code column No.
-CSV_NEW_BU_CD_COL_NO = 6
+CSV_NEW_BU_CD_COL_NO = 7
 # Apply start date column No.
-CSV_NEW_START_DATE = 7
+CSV_NEW_START_DATE = 8
 # Apply end date column No.
-CSV_NEW_END_DATE = 8
+CSV_NEW_END_DATE = 9

 # Facility staff change registration CSV (master maintenance)
 CHANGE_INST_CSV_LOGICAL_NAMES = [
@ -116,6 +121,7 @@ CHANGE_INST_CSV_LOGICAL_NAMES = [
'施設名', '施設名',
'領域コード', '領域コード',
'説明', '説明',
'担当者種別コード',
'MUID', 'MUID',
'担当者名', '担当者名',
'施設担当_開始日', '施設担当_開始日',
@ -139,18 +145,20 @@ CSV_CHANGE_INST_NAME_COL_NO = 5
CSV_CHANGE_TA_CD_COL_NO = 6 CSV_CHANGE_TA_CD_COL_NO = 6
# 説明の列No # 説明の列No
CSV_CHANGE_EXPLAIN_COL_NO = 7 CSV_CHANGE_EXPLAIN_COL_NO = 7
# 担当者種別コード
CSV_CHANGE_EMP_CHG_TYPE_CD_COL_NO = 8
# MUIDの列No # MUIDの列No
CSV_CHANGE_EMP_CD_COL_NO = 8 CSV_CHANGE_EMP_CD_COL_NO = 9
# 担当者名の列No # 担当者名の列No
CSV_CHANGE_EMP_FULL_NAME_COL_NO = 9 CSV_CHANGE_EMP_FULL_NAME_COL_NO = 10
# 施設担当_開始日の列No # 施設担当_開始日の列No
CSV_CHANGE_INST_EMP_START_DATE_COL_NO = 10 CSV_CHANGE_INST_EMP_START_DATE_COL_NO = 11
# 施設担当_終了日の列No # 施設担当_終了日の列No
CSV_CHANGE_INST_EMP_END_DATE_COL_NO = 11 CSV_CHANGE_INST_EMP_END_DATE_COL_NO = 12
# 終了日の変更の列No # 終了日の変更の列No
CSV_CHANGE_CHANGE_END_DATE_COL_NO = 12 CSV_CHANGE_CHANGE_END_DATE_COL_NO = 13
# コメントの列No # コメントの列No
CSV_CHANGE_COMMENT = 13 CSV_CHANGE_COMMENT = 14
# CSVアップロードテーブル名(マスターメンテ) # CSVアップロードテーブル名(マスターメンテ)
CSV_REAL_TABLE_NAME = '本番テーブル' CSV_REAL_TABLE_NAME = '本番テーブル'
@ -162,6 +170,7 @@ MENTE_CSV_DOWNLOAD_EXTRACT_COLUMNS = [
'inst_cd', 'inst_cd',
'inst_name', 'inst_name',
'ta_cd', 'ta_cd',
'emp_chg_type_cd',
'emp_cd', 'emp_cd',
'emp_name_full', 'emp_name_full',
'bu_cd', 'bu_cd',
@ -178,6 +187,7 @@ MENTE_CSV_DOWNLOAD_HEADER = [
'施設コード', '施設コード',
'施設名', '施設名',
'領域コード', '領域コード',
'担当者種別コード',
'MUID', 'MUID',
'担当者名', '担当者名',
'ビジネスユニットコード', 'ビジネスユニットコード',
@ -207,3 +217,6 @@ DISPLAY_USER_STOP_DIV_SHORT = {
'03': '特定項目停止', '03': '特定項目停止',
'04': '全DM停止' '04': '全DM停止'
} }
# ログイン失敗回数上限(保守ユーザー)
LOGIN_FAIL_LIMIT = 10
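The new constants wire an account-lockout path into the logout flow. A minimal sketch of how they might be consumed (the constants are reproduced from the diff above; the dictionary fallback and the `>=` comparison are assumptions about the views, not taken from the diff):

```python
# Constants copied from constants.py as changed in this commit
LOGOUT_REASON_UNEXPECTED = 'unexpected'
LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED = 'login_failed_limit_exceeded'
LOGIN_FAIL_LIMIT = 10

LOGOUT_REASON_MESSAGE_MAP = {
    LOGOUT_REASON_UNEXPECTED: '予期しないエラーが発生しました。<br>再度Loginするか、<br>管理者に問い合わせてください。',
    LOGOUT_REASON_LOGIN_FAILED_LIMIT_EXCEEDED: 'ログイン失敗回数の上限を超えましたので<br>アカウントをロックしました。<br>管理者に連絡してください',
}


def logout_message(reason: str) -> str:
    # Assumption: unknown reasons fall back to the generic "unexpected" message
    return LOGOUT_REASON_MESSAGE_MAP.get(
        reason, LOGOUT_REASON_MESSAGE_MAP[LOGOUT_REASON_UNEXPECTED])


def should_lock(failed_count: int) -> bool:
    # Assumption: the account locks once the failure count reaches the limit
    return failed_count >= LOGIN_FAIL_LIMIT
```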

View File

@ -1,19 +1,19 @@
 <meta charset="utf-8">
 <meta http-equiv="X-UA-Compatible" content="IE=edge">
 <meta name="viewport" content="width=device-width, initial-scale=1">
-<meta name="format-detection" content="telephone=no, address=no" http-equiv="content-type" content="text/html; charset=utf-8" />
+<meta http-equiv="content-type" content="text/html; charset=utf-8" />
+<meta name="format-detection" content="telephone=no, address=no" />
 <title>{{subtitle}}</title>
-<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/css/bootstrap.min.css" integrity="sha384-GLhlTQ8iRABdZLl6O3oVMWSktQOp6b7In1Zl3/Jr59b6EGGoI1aFkw7cmDA6j6gD" crossorigin="anonymous">
-<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.10.2/font/bootstrap-icons.css" integrity="sha384-b6lVK+yci+bfDmaY1u0zE8YYJt0TZxLEAFyYSLHId4xoVvsrQu3INevFKo+Xir8e" crossorigin="anonymous">
-<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/flatpickr/dist/flatpickr.min.css">
-<link rel="stylesheet" href="/static/css/main_theme.css">
-<link rel="stylesheet" href="/static/css/pagenation.css">
-<link rel="stylesheet" href="/static/css/datepicker.css">
-<link rel="stylesheet" href="/static/css/loading.css">
-<script src="https://code.jquery.com/jquery-3.6.3.min.js" integrity="sha256-pvPw+upLPUjgMXY0G+8O0xUf+/Im1MZjXxxgOcBQBXU=" crossorigin="anonymous"></script>
-<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/js/bootstrap.bundle.min.js" integrity="sha384-w76AqPfDkMBDXo30jS1Sgez6pr3x5MlQ1ZAGC+nuZB+EYdgRZgiwxhTBTkF7CXvN" crossorigin="anonymous"></script>
-<script src="https://pagination.js.org/dist/2.5.0/pagination.min.js" crossorigin="anonymous"></script>
-<script src="https://cdn.jsdelivr.net/npm/flatpickr@4.6.13/dist/flatpickr.min.js"></script>
-<script src="https://cdn.jsdelivr.net/npm/flatpickr/dist/l10n/ja.min.js"></script>
-<script src="/static/function/businessLogicScript.js"></script>
-<script src="/static/lib/fixed_midashi.js"></script>
+<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/css/bootstrap.min.css" crossorigin="anonymous">
+<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/flatpickr/dist/flatpickr.min.css" crossorigin="anonymous">
+<link rel="stylesheet" href="/static/css/main_theme.css" integrity="sha384-k0YpJBvcGJXdJlt8yqnhPYuU7tHQfdv4C80KDdGf72dzzAVWUp+ek+A1cqOV5o4t">
+<link rel="stylesheet" href="/static/css/pagenation.css" integrity="sha384-CDhOHftwvzWdI3cmvl0PESIdU5i0qjWbz8+HE9poJscglyrB0jzXZpVkb51xigty">
+<link rel="stylesheet" href="/static/css/datepicker.css" integrity="sha384-I3gPqeqj0wDLoF6oS/OuMJ5C+BI210zLrJvQvNRVdvyyI9+qrraaQK2L9vvhTA8x">
+<link rel="stylesheet" href="/static/css/loading.css" integrity="sha384-f9FRohCbLarb6Z91FWRbfNIIYYLx/5Kxqw19CB9Z0GxXunS9j0gRWWl50LayDAG7">
+<script src="https://cdn.jsdelivr.net/npm/jquery@3.6.3/dist/jquery.min.js" crossorigin="anonymous"></script>
+<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/js/bootstrap.bundle.min.js" crossorigin="anonymous"></script>
+<script src="https://cdn.jsdelivr.net/npm/paginationjs@2.5.0/dist/pagination.min.js" crossorigin="anonymous"></script>
+<script src="https://cdn.jsdelivr.net/npm/flatpickr@4.6.13/dist/flatpickr.min.js" crossorigin="anonymous"></script>
+<script src="https://cdn.jsdelivr.net/npm/flatpickr/dist/l10n/ja.min.js" crossorigin="anonymous"></script>
+<script src="/static/function/businessLogicScript.js" integrity="sha384-ytd1o7Rx4BPzjO3RpzR9fW/Z4avGzS7+BRPZVUsQp5X4zXB6xdZpR47/En1mNl7s" crossorigin="anonymous"></script>
+<script src="/static/lib/fixed_midashi.js" integrity="sha384-mCd6L3DNaLgUWyH051BywJfzlVavCkK6F0wbMqG+j7jAq174Uf7HJdq3H4wxCJKs" crossorigin="anonymous"></script>
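This commit consolidates third-party assets onto jsDelivr and adds `integrity` attributes to the locally served files. A Subresource Integrity value is just the base64-encoded SHA-384 digest of the file with a `sha384-` prefix; a small sketch of how such a value can be generated (the helper name and example path are illustrative, not from the repo):

```python
import base64
import hashlib


def sri_sha384(data: bytes) -> str:
    """Return a Subresource Integrity value: "sha384-" + base64(SHA-384 digest)."""
    digest = hashlib.sha384(data).digest()
    return 'sha384-' + base64.b64encode(digest).decode('ascii')


# e.g. sri_sha384(open('static/css/main_theme.css', 'rb').read())
```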

View File

@ -81,6 +81,17 @@
             </td>
         </tr>
         <!-- 検索フォーム2行目 -->
+        <tr>
+            <!-- 担当者種別コード -->
+            <td class="searchLabelTd">担当者種別コード:</td>
+            <td class="searchInputTd">
+                <input class="searchTextbox" type="text" name="ctrl_emp_chg_type_cd" value="{{mainte_csv_dl.emp_chg_type_cd | safe}}" maxlength='10'
+                    onchange="formBtDisabled()"
+                    oninput="formBtDisabled()"
+                >
+            </td>
+        </tr>
+        <!-- 検索フォーム3行目 -->
         <tr>
             <!-- 適用期間内 -->
             <td class="searchLabelTd">適用期間内:</td>
@ -117,7 +128,7 @@
                 >
             </td>
         </tr>
-        <!-- 検索フォーム3行目 -->
+        <!-- 検索フォーム4行目 -->
         <tr>
             <!-- 対象テーブル -->
             <td class="searchLabelTd">対象テーブル:</td>
@ -160,7 +171,7 @@
                 >
             </td>
         </tr>
-        <!-- 検索フォーム4行目 -->
+        <!-- 検索フォーム5行目 -->
         <tr>
             <!-- 検索、クリアボタン -->
             <td class="searchButtonTd" colspan="6">

View File

@ -15,61 +15,8 @@
 {{logout.reason}}
 {% endautoescape %}
 </p>
-<!-- <?php
-    // getが来ておらず理由がわからない場合
-    if(!(isset($_GET['reason']))){
-        $userflg = null;
-    // ログアウトボタンを押されたとき
-    } else if($_GET['reason'] == 'logoutBtn'){
-?>
-    <p class="logout_p"><?php echo $logoutMsg ?></p>
-<?php
-    // ログイン失敗時に表示
-    } else if($_GET['reason'] == 'loginErr'){
-?>
-    <p class="logout_p"><?php echo $loginErrMsg ?></p>
-<?php
-    // 日時バッチ処理中エラー時
-    } else if($_GET['reason'] == 'batchProcess'){
-?>
-    <p class="logout_p"><?php echo $batchProcessMsg ?></p>
-<?php
-    // マスターメンテ日時バッチ処理中エラー時
-    } else if($_GET['reason'] == 'batchProcessNewInstEmpRegist'){
-?>
-    <p class="logout_p"><?php echo $batchProcessNewInstEmpRegistMsg ?></p>
-<?php
-    // どっちのユーザーでログインしたかわからないとき
-    } else if (!(isset($userflg))) {
-    } else {
-        $userflg = null;
-?>
-    <p class="logout_p"><?php echo $unexpectedErrMsg ?></p>
-<?php
-    }
-?> -->
 <br><br><br>
 <p class="logout_p"><a href="{{ logout.redirect_to }}">{{logout.link_text}}</a></p>
-<!-- MeDaCA機能メニューへ -->
-<!-- <p class="logout_p"><a href="redirect_to">Login画面に戻る</a></p> -->
-<!-- <?php
-    if (!(isset($userflg))) {
-?>
-    <p class="logout_p"><a href="<?php echo $groupwarePath ?>"><?php echo $groupwareBackMsg ?></a></p>
-<?php
-    } else if($userflg == 1){
-?>
-    <p class="logout_p"><a href="<?php echo $maintLoginPath ?>"><?php echo $loginBackMsg ?></a></p>
-<?php
-    } else {
-?>
-    <p class="logout_p"><a href="<?php echo $groupwarePath ?>"><?php echo $groupwareBackMsg ?></a></p>
-<?php
-    }
-?> -->
 </div>
 </body>
 </html>

View File

@ -0,0 +1,13 @@
tests/*
.coverage
.env
.env.example
.report/*
.vscode/*
.pytest_cache/*
*/__pycache__/*
Dockerfile
pytest.ini
README.md
*.sql
*.gz

View File

@ -0,0 +1,7 @@
DB_HOST=************
DB_PORT=3306
DB_USERNAME=************
DB_PASSWORD=************
DB_SCHEMA=src05
DUMP_FILE_S3_PATH=*******************
LOG_LEVEL=INFO

ecs/restore-dbdump/.gitignore vendored Normal file
View File

@ -0,0 +1,16 @@
.vscode/settings.json
.env
# python
__pycache__
# python test
.pytest_cache
.coverage
.report/
# mysql config file
my.cnf
# compress file
*.gz

ecs/restore-dbdump/.vscode/launch.json vendored Normal file
View File

@ -0,0 +1,17 @@
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "(DEBUG)restore dbdump",
            "type": "python",
            "request": "launch",
            "program": "entrypoint.py",
            "console": "integratedTerminal",
            "justMyCode": true
        }
    ]
}

View File

@ -0,0 +1,31 @@
{
"[python]": {
"editor.defaultFormatter": null,
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": true
}
},
// Path to the Python interpreter of the project's virtual environment
"python.defaultInterpreterPath": "<path to your Python interpreter>",
"python.linting.lintOnSave": true,
"python.linting.enabled": true,
"python.linting.pylintEnabled": false,
"python.linting.flake8Enabled": true,
"python.linting.flake8Args": [
"--max-line-length=200",
"--ignore=F541"
],
"python.formatting.provider": "autopep8",
"python.formatting.autopep8Path": "autopep8",
"python.formatting.autopep8Args": [
"--max-line-length", "200",
"--ignore=F541"
],
"python.testing.pytestArgs": [
"tests/batch/ultmarc"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
}

View File

@ -0,0 +1,45 @@
FROM python:3.12-slim-bookworm

ENV TZ="Asia/Tokyo"
# Do not buffer Python's standard output
ENV PYTHONUNBUFFERED=1
# Do not write Python bytecode files
ENV PYTHONDONTWRITEBYTECODE=1

WORKDIR /usr/src/app

COPY Pipfile Pipfile.lock ./
# Copy the answer file piped to stdin when installing mysql-apt-config via dpkg
COPY mysql_dpkg_selection.txt ./

# Install required packages
RUN apt update && apt install -y less vim curl wget gzip unzip sudo lsb-release

# Install the MySQL client
RUN \
    wget https://dev.mysql.com/get/mysql-apt-config_0.8.29-1_all.deb && \
    apt install -y gnupg && \
    dpkg -i mysql-apt-config_0.8.29-1_all.deb < mysql_dpkg_selection.txt && \
    apt update && \
    apt install -y mysql-client

# Install AWS CLI v2
RUN \
    curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
    unzip awscliv2.zip && \
    sudo ./aws/install

# Install Python dependencies
RUN \
    pip install --upgrade pip wheel setuptools && \
    pip install pipenv --no-cache-dir && \
    pipenv install --system --deploy && \
    pip uninstall -y pipenv virtualenv-clone virtualenv

# Apply security updates for OS packages only
RUN \
    apt install -y unattended-upgrades && \
    unattended-upgrades

COPY src ./src
COPY entrypoint.py entrypoint.py

CMD ["python", "entrypoint.py"]

View File

@ -0,0 +1,16 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
[dev-packages]
autopep8 = "*"
flake8 = "*"
[requires]
python_version = "3.12"
[pipenv]
allow_prereleases = true

ecs/restore-dbdump/Pipfile.lock generated Normal file
View File

@ -0,0 +1,63 @@
{
"_meta": {
"hash": {
"sha256": "2f7808325e11704ced6ad10c85e1d583663a03d7ccabaa9696ab1fe133a6b30c"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.12"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {},
"develop": {
"autopep8": {
"hashes": [
"sha256:89440a4f969197b69a995e4ce0661b031f455a9f776d2c5ba3dbd83466931758",
"sha256:ce8ad498672c845a0c3de2629c15b635ec2b05ef8177a6e7c91c74f3e9b51128"
],
"index": "pypi",
"markers": "python_version >= '3.9'",
"version": "==2.3.2"
},
"flake8": {
"hashes": [
"sha256:1cbc62e65536f65e6d754dfe6f1bada7f5cf392d6f5db3c2b85892466c3e7c1a",
"sha256:c586ffd0b41540951ae41af572e6790dbd49fc12b3aa2541685d253d9bd504bd"
],
"index": "pypi",
"markers": "python_full_version >= '3.8.1'",
"version": "==7.1.2"
},
"mccabe": {
"hashes": [
"sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325",
"sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"
],
"markers": "python_version >= '3.6'",
"version": "==0.7.0"
},
"pycodestyle": {
"hashes": [
"sha256:46f0fb92069a7c28ab7bb558f05bfc0110dac69a0cd23c61ea0040283a9d78b3",
"sha256:6838eae08bbce4f6accd5d5572075c63626a15ee3e6f842df996bf62f6d73521"
],
"markers": "python_version >= '3.8'",
"version": "==2.12.1"
},
"pyflakes": {
"hashes": [
"sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f",
"sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a"
],
"markers": "python_version >= '3.8'",
"version": "==3.2.0"
}
}
}

View File

@ -0,0 +1,67 @@
# Dump restore script

## Overview

This process is not tied to any single feature; it is meant to run as a shared job, on demand, whenever a requirement calls for restoring a dump.

## Environment

- Python 3.12
- MySQL 8.23
- VSCode

## Setup

- Python
  - See [Python environment setup](https://nds-tyo.backlog.com/alias/wiki/1874930) in the Merck_NewDWH 開発 2021 Wiki
  - Complete the steps up to and including "Installing Pipenv"
  - Then run the following command under the project root to create the Python virtual environment
    - `pipenv install --dev --python <python version installed with pyenv>`
  - Note down the virtual environment path this step prints; it is needed later in the VSCode setup
- MySQL
  - On Windows, download the installer from the link below
    - <https://dev.mysql.com/downloads/installer/>
  - When using Docker, the MySQL setup in the "newsdwh-tools" repository is convenient
    - Run the following command in the "crm-table-to-ddl" folder
    - `docker-compose up -d`
    - For setting up Docker itself, see the [Docker setup guide](https://nds-tyo.backlog.com/alias/wiki/1754332)
  - Load the data
    - Create the "src05" schema in the database you started
    - Download the [local development data](https://ndstokyo.sharepoint.com/:f:/r/sites/merck-new-dwh-team/Shared%20Documents/03.NewDWH%E6%A7%8B%E7%AF%89%E3%83%95%E3%82%A7%E3%83%BC%E3%82%BA3/02.%E9%96%8B%E7%99%BA/90.%E9%96%8B%E7%99%BA%E5%85%B1%E6%9C%89/%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E9%96%8B%E7%99%BA%E7%94%A8%E3%83%87%E3%83%BC%E3%82%BF?csf=1&web=1&e=VVcRUs) and restore it with the mysql command
    - `mysql -h <host name> -P <port> -u <user name> -p src05 < src05_dump.sql`
- Environment variables
  - Copy the ".env.example" file to create a ".env" file
  - Set the environment variables; ask a project member for the actual values
- VSCode
  - Copy the ".vscode/recommended_settings.json" file to create a "settings.json" file
  - Change "python.defaultInterpreterPath" to the virtual environment path created in the Python setup step

## Running

- Press "F5" in VSCode to start the batch.
- "entrypoint.py" is the entry point of the batch.
- The actual processing is done in "src/restore_backup.py".

## Folder layout

```txt
.
├── .env.example             -- sample environment variable values for local runs
├── .dockerignore            -- files excluded from the docker build context
├── .gitignore               -- files excluded from Git tracking
├── Dockerfile               -- builds the Docker image
├── Pipfile                  -- Python dependency management
├── Pipfile.lock             -- pinned dependency versions
├── README.md                -- this file
├── entrypoint.py            -- entry point of the batch
├── mysql_dpkg_selection.txt -- selection values fed to dpkg during the Docker image build
└── src                      -- source code
    ├── logging
    │   └── get_logger.py    -- logger
    ├── restore_backup.py    -- the dump restore process itself
    └── system_var
        ├── constants.py     -- constants
        └── environment.py   -- environment variables
```
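The batch downloads a gzipped dump and streams it through `gunzip -c` into `mysql`. The decompression half of that pipeline can be exercised without a database; a sketch with a stand-in payload (the SQL content here is illustrative only):

```python
import gzip

# Stand-in for the dump object fetched from S3 (the real file is dump.gz)
compressed = gzip.compress(b'SELECT 1;\n')

# What `gunzip -c dump.gz` hands to the mysql process on stdin
sql_stream = gzip.decompress(compressed)
```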

View File

@ -0,0 +1,10 @@
"""Entry point for the actual-sales (実消化) & Ultmarc DB dump restore."""
from src import restore_backup

if __name__ == '__main__':
    try:
        exit(restore_backup.exec())
    except Exception:
        # Even on error, exit with the success code.
        # The error itself is logged inside restore_backup.
        exit(0)
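The entry point deliberately maps every failure to exit code 0 so the surrounding scheduler never sees a non-zero status. That policy can be isolated as a tiny wrapper (a sketch; `run_safely` is a hypothetical name, not part of the repo):

```python
def run_safely(job) -> int:
    """Run a batch job, converting any exception into the success exit code."""
    try:
        return job()
    except Exception:
        # Mirrors entrypoint.py: errors are logged elsewhere, never re-raised
        return 0
```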

View File

@ -0,0 +1,3 @@
1
1
4

View File

View File

@ -0,0 +1,37 @@
import logging

from src.system_var.environment import LOG_LEVEL

# Pin the log level of boto3-related modules up front to suppress their DEBUG output
for name in ["boto3", "botocore", "s3transfer", "urllib3"]:
    logging.getLogger(name).setLevel(logging.WARNING)


def get_logger(log_name: str) -> logging.Logger:
    """Return a uniquely named, configured logger.

    Args:
        log_name (str): logger name

    Returns:
        logging.Logger: the configured logger
    """
    logger = logging.getLogger(log_name)
    level = logging.getLevelName(LOG_LEVEL)
    if not isinstance(level, int):
        level = logging.INFO
    logger.setLevel(level)
    if not logger.hasHandlers():
        handler = logging.StreamHandler()
        logger.addHandler(handler)
    formatter = logging.Formatter(
        '%(name)s\t[%(levelname)s]\t%(asctime)s\t%(message)s',
        '%Y-%m-%d %H:%M:%S'
    )
    for handler in logger.handlers:
        handler.setFormatter(formatter)
    return logger

View File

@ -0,0 +1,108 @@
"""Dump restore script"""
import os
import subprocess
import textwrap

from src.logging.get_logger import get_logger
from src.system_var import constants, environment

logger = get_logger('ダンプ復元スクリプト')


def exec():
    try:
        logger.info('ダンプ復元スクリプト:開始')
        # Pre-processing (a no-op in the shared version)
        _pre_exec()

        # Main processing
        # Build the MySQL connection settings
        my_cnf_file_content = f"""
        [client]
        user={environment.DB_USERNAME}
        password={environment.DB_PASSWORD}
        host={environment.DB_HOST}
        """
        # Path of the my.cnf file
        my_cnf_path = os.path.join('my.cnf')
        # Write the my.cnf file
        with open(my_cnf_path, 'w') as f:
            f.write(textwrap.dedent(my_cnf_file_content)[1:-1])
        os.chmod(my_cnf_path, 0o444)

        # Connect to the MySQL server up front to detect connection errors early
        mysql_pre_process = subprocess.Popen(
            ['mysql', f'--defaults-file={my_cnf_path}', '-P', f"{environment.DB_PORT}",
             environment.DB_SCHEMA, '-N', '-e', 'SELECT 1;'],
            stderr=subprocess.PIPE
        )
        _, error = mysql_pre_process.communicate()
        if mysql_pre_process.returncode != 0:
            logger.error(
                f'MySQLサーバーへの接続に失敗しました。{"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS

        # Identify the dump file to restore
        s3_file_path = environment.DUMP_FILE_S3_PATH
        # Download the dump file locally with `aws s3 cp`
        s3_cp_process = subprocess.Popen(
            ['aws', 's3', 'cp', s3_file_path, './dump.gz'], stderr=subprocess.PIPE)
        _, error = s3_cp_process.communicate()
        if s3_cp_process.returncode != 0:
            logger.error(
                f'`aws s3 cp`実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS
        # Close the S3 command's stderr
        s3_cp_process.stderr.close()

        # Decompress the dump file with gunzip
        gzip_process = subprocess.Popen(
            ['gunzip', '-c', './dump.gz'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        # Restore the dump with the mysql command
        mysql_process = subprocess.Popen(
            ['mysql', f'--defaults-file={my_cnf_path}', '-P',
             f"{environment.DB_PORT}", environment.DB_SCHEMA],
            stdin=gzip_process.stdout, stderr=subprocess.PIPE
        )
        # gzip's stdout is now wired to mysql, so close our handle to it
        gzip_process.stdout.close()
        _, error = mysql_process.communicate()
        if mysql_process.returncode != 0:
            logger.error(
                f'コマンド実行時にエラーが発生しました。{"" if error is None else error.decode("utf-8")}')
            return constants.BATCH_EXIT_CODE_SUCCESS

        # Post-processing (a no-op in the shared version)
        _post_exec()
        logger.info('[NOTICE]ダンプ復元スクリプト:終了(正常終了)')
        return constants.BATCH_EXIT_CODE_SUCCESS
    except Exception as e:
        logger.exception(f'ダンプ復元スクリプト中に想定外のエラーが発生しました :{e}')
        return constants.BATCH_EXIT_CODE_SUCCESS


def _pre_exec():
    """Dump restore pre-processing.

    The shared version implements no pre-processing.
    Copy this module when a restore job needs its own pre-processing.
    """
    pass


def _post_exec():
    """Dump restore post-processing.

    The shared version implements no post-processing.
    Copy this module when a restore job needs its own post-processing.
    """
    pass
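The my.cnf template above relies on a small idiom: `textwrap.dedent` strips the indentation the triple-quoted f-string picks up from the surrounding code, and the `[1:-1]` slice drops the leading and trailing newlines the literal introduces. In isolation:

```python
import textwrap

# Same shape as the my.cnf template in exec(); values are placeholders
raw = """
    [client]
    user=app
    password=secret
"""
content = textwrap.dedent(raw)[1:-1]
# content now starts at '[client]' with no surrounding blank lines
```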

Some files were not shown because too many files have changed in this diff.