first commit
commit 00cdae8d88

.env (new file, 7 lines)
@@ -0,0 +1,7 @@
GOOGLE_API_KEY=AIzaSyALoFs_oS8NdOLKaOi0lqL-hOQ9MT4c6e4
PLACE_ID=ChIJl_j1m6v71y0RzcEBmBCj8tg
DB_HOST=192.169.0.10
DB_PORT=5432
DB_NAME=postgres
DB_USER=admin_popcornsales
DB_PASSWORD=Jempol&1992Mutu_popcornsales
README.md (new file, 115 lines)
@@ -0,0 +1,115 @@
# Google Maps Review Crawler

A Python-based automation tool that fetches customer reviews from the Google Business Profile API and synchronizes them to a PostgreSQL database.

## Features

- **OAuth 2.0 Authentication**: Connects directly to your Google Business Profile to fetch *all* reviews (bypassing the standard Places API 5-review limit).
- **Multi-Outlet Support**: Automatically queries the `master_outlet` table for active `google_business_id`s and iterates through all your locations.
- **Original Language Extraction**: Strips automated "Translated by Google" annotations so that only authentic, original review text is saved.
- **Rolling Window Filter**: Only processes reviews published within the last 3 months (90 days) to reduce API calls.
- **UPSERT Logic**: Safely updates existing database records (e.g., when a customer changes their rating or text) without creating duplicates.
- **Automated Scheduler**: Includes a background daemon script (`schedule_crawler.py`) that runs the crawler once every hour.

---
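The original-language extraction mentioned under Features relies on the bilingual comment format Google uses, roughly `"(Translated by Google) <translation>\n\n(Original)\n<original>"`. A minimal sketch of that stripping step (the function name, marker handling, and sample text are illustrative; the real implementation inside the crawler is not shown here):

```python
# Hypothetical helper, assuming Google's bilingual comment layout:
# "(Translated by Google) <translation>\n\n(Original)\n<original>".
def extract_original_text(comment: str) -> str:
    """Return only the untranslated portion of a review comment."""
    marker = "(Original)"
    if marker in comment:
        # Keep everything after the "(Original)" marker.
        comment = comment.split(marker, 1)[1]
    # Drop any leftover "(Translated by Google)" annotation.
    comment = comment.replace("(Translated by Google)", "")
    return comment.strip()

print(extract_original_text(
    "(Translated by Google) Tasty popcorn\n\n(Original)\nPopcorn enak"
))  # Popcorn enak
```

Reviews that were never translated pass through unchanged.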
## Prerequisites

- Python 3.9+
- PostgreSQL database
- A Google Cloud Project with the **Google My Business API (v4)** enabled.
- OAuth 2.0 client credentials downloaded as `client_secret.json`.

---

## 1. Installation & Environment Setup

1. **Clone the repository** (or navigate to the folder).
2. **Create a virtual environment**:
   ```bash
   python -m venv venv
   source venv/bin/activate
   ```
3. **Install dependencies**:
   ```bash
   pip install -r requirements.txt
   ```
4. **Configure database credentials**:
   Create a `.env` file in the root directory:
   ```env
   DB_HOST=192.169.0.10
   DB_PORT=5432
   DB_NAME=your_db_name
   DB_USER=your_db_user
   DB_PASSWORD=your_db_password
   ```

---
## 2. Authentication (One-Time Setup)

The Google Business Profile API requires explicit permission to read your reviews.

1. Ensure the `client_secret.json` from the Google Cloud Console is in the project folder.
2. Run the authorization script:
   ```bash
   python authorize.py
   ```
3. A browser window will open. Log in with the Google Account that manages your Business Profiles.
4. Once authorized, a `token.json` file is created. The crawler will automatically use and refresh this token from then on.

---

## 3. Database Setup & Mapping

The crawler maps Google locations to your database using `google_business_id`.

1. **Find your Location IDs**:
   Run the helper script to list all active stores managed by your Google Account:
   ```bash
   python list_locations.py
   ```
   *Note: this outputs the Store Code and the long numeric Location ID.*
2. **Update your database**:
   Insert the Location ID into the `google_business_id` column of your `master_outlet` table.

---
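For reference, the outlet lookup implied by the multi-outlet feature can be sketched as a plain query. The table and column names come from this README; the exact statement inside `crawler.py` is an assumption and is not shown in this commit:

```python
# Hypothetical query for the active outlets to crawl; the real statement
# in crawler.py is not part of this README.
OUTLET_SQL = """
SELECT popcorn_code, google_business_id
FROM master_outlet
WHERE google_business_id IS NOT NULL;
"""

print(OUTLET_SQL.strip())
```

Each returned `google_business_id` is then used to fetch that location's reviews.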

## 4. Running the Crawler

### Manual Execution
To run the crawler once immediately:
```bash
python crawler.py
```
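Each run only touches recent reviews. The 90-day rolling window from the Features list can be sketched as follows (function names are illustrative, not taken from `crawler.py`):

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch of the 90-day rolling window filter; names are
# hypothetical, not from the source.
def review_cutoff(now=None, days=90):
    """Oldest publish time the crawler will still process."""
    now = now or datetime.now(timezone.utc)
    return now - timedelta(days=days)

def is_recent(publish_time, cutoff):
    return publish_time >= cutoff

cutoff = review_cutoff(datetime(2026, 2, 19, tzinfo=timezone.utc))
print(cutoff)  # 2025-11-21 00:00:00+00:00
```

Reviews older than the cutoff are skipped, which is what the "Skipped N old reviews" lines in `crawler.log` refer to.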

### Automated (Hourly Scheduler)
To run the crawler continuously in the background (runs once every hour):
```bash
chmod +x run_hourly.sh
./run_hourly.sh
```

- **Scheduler logs**: `tail -f scheduler.log` (monitors the hourly heartbeat)
- **Crawler logs**: `tail -f crawler.log` (monitors the individual reviews being upserted)

To stop the background scheduler:
```bash
pkill -f schedule_crawler.py
```
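`schedule_crawler.py` itself is not included in this commit; purely as an illustration, an hourly scheduler loop of this kind can be sketched as (all names here are hypothetical):

```python
import time

# Illustrative hourly scheduler loop; NOT the actual schedule_crawler.py,
# whose internals are not part of this commit.
def run_on_schedule(job, interval_seconds=3600, iterations=None, sleep=time.sleep):
    """Call `job` every `interval_seconds`; run forever when iterations is None."""
    count = 0
    while iterations is None or count < iterations:
        job()
        count += 1
        if iterations is None or count < iterations:
            sleep(interval_seconds)
    return count

# Example with a stub job and no real sleeping:
calls = []
run_on_schedule(lambda: calls.append("crawl"), iterations=3, sleep=lambda s: None)
print(calls)  # ['crawl', 'crawl', 'crawl']
```

In production the `job` would invoke the crawl and the default `time.sleep` would provide the one-hour spacing seen in `crawler.log`.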

---

## Database Schema (`google_review`)

Built automatically by `database.py`:
- `id` (SERIAL PRIMARY KEY)
- `review_id` (TEXT UNIQUE)
- `place_id` (TEXT) - *legacy column, nullable*
- `original_text` (TEXT) - *the clean, untranslated review text*
- `author_display_name` (TEXT)
- `publish_time` (TIMESTAMP)
- `rating` (INTEGER)
- `outlet_code` (VARCHAR) - *foreign key to `master_outlet.popcorn_code`*
- `language` (VARCHAR)
- `created_at` (TIMESTAMP)
- `updated_at` (TIMESTAMP)
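The UPSERT behavior described under Features maps naturally onto PostgreSQL's `INSERT ... ON CONFLICT`. Below is a hypothetical statement consistent with the schema above; the actual query inside `database.py` is not shown in this commit:

```python
# Hypothetical upsert statement keyed on the UNIQUE review_id column;
# the real query in database.py may differ.
UPSERT_SQL = """
INSERT INTO google_review
    (review_id, place_id, original_text, author_display_name,
     publish_time, rating, outlet_code, language, created_at, updated_at)
VALUES
    (%(review_id)s, %(place_id)s, %(original_text)s, %(author_display_name)s,
     %(publish_time)s, %(rating)s, %(outlet_code)s, %(language)s, NOW(), NOW())
ON CONFLICT (review_id) DO UPDATE SET
    original_text = EXCLUDED.original_text,
    rating        = EXCLUDED.rating,
    updated_at    = NOW();
"""

print(UPSERT_SQL.strip())
```

Re-running the crawler on a review it has already stored then updates the existing row instead of inserting a duplicate.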
__pycache__/database.cpython-312.pyc (new binary file)
Binary file not shown.
authorize.py (new file, 44 lines)
@@ -0,0 +1,44 @@
import os.path
import json
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow

# Scopes for Business Profile API
# https://developers.google.com/my-business/content/basic-setup#request_scopes
SCOPES = [
    'https://www.googleapis.com/auth/business.manage'
]

def authorize():
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)

    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            print("Starting OAuth flow...")
            print("If you are running this on a remote server without a browser, you may need to:")
            print("1. Run this script on your local machine to generate token.json")
            print("2. Upload token.json to this directory.")
            print("Or use port forwarding (e.g. ssh -L 8080:localhost:8080 user@host).")

            flow = InstalledAppFlow.from_client_secrets_file(
                'client_secret.json', SCOPES)

            # Using a fixed port 8080 to make port forwarding easier if needed
            creds = flow.run_local_server(port=8080, open_browser=True)

        # Save the credentials for the next run
        with open('token.json', 'w') as token:
            token.write(creds.to_json())
        print("Token saved to token.json")

if __name__ == '__main__':
    authorize()
check_data.py (new file, 30 lines)
@@ -0,0 +1,30 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def check_data():
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            cur.execute("SELECT count(*) FROM google_review;")
            count = cur.fetchone()[0]
            print(f"Total reviews in 'google_review' table: {count}")

            cur.execute("SELECT author_display_name, rating, publish_time, outlet_code, language, original_text FROM google_review ORDER BY created_at DESC LIMIT 5;")
            rows = cur.fetchall()
            for row in rows:
                print(row)
        conn.close()
    except Exception as e:
        print(f"Error checking data: {e}")

if __name__ == "__main__":
    check_data()
check_original.py (new file, 27 lines)
@@ -0,0 +1,27 @@
from database import get_db_connection

conn = get_db_connection()
with conn.cursor() as cur:
    cur.execute("""
        SELECT author_display_name, rating, original_text
        FROM google_review
        WHERE original_text LIKE '%(Translated by Google)%'
        LIMIT 5;
    """)
    rows = cur.fetchall()
    print(f"Reviews still containing '(Translated by Google)': {len(rows)}")
    for r in rows:
        print(r)

    cur.execute("""
        SELECT author_display_name, rating, original_text
        FROM google_review
        WHERE original_text IS NOT NULL AND original_text != ''
        ORDER BY updated_at DESC
        LIMIT 5;
    """)
    recent = cur.fetchall()
    print("\nRecent reviews:")
    for r in recent:
        print(r)
conn.close()
client_secret.json (new file, 1 line)
@@ -0,0 +1 @@
{"installed":{"client_id":"804823156361-3mk31f6a14r6np9usmm2mo5qnjl9lk00.apps.googleusercontent.com","project_id":"western-octagon-487809-s0","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_secret":"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","redirect_uris":["http://localhost"]}}
crawler.log (new file, 855 lines)
@@ -0,0 +1,855 @@
|
||||
2026-02-19 09:24:12,016 - INFO - Starting crawl...
|
||||
2026-02-19 09:24:12,074 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 09:24:12,075 - INFO - Filtering reviews published after: 2026-02-18 02:24:12.075193 UTC
|
||||
2026-02-19 09:24:14,933 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
|
||||
2026-02-19 09:24:14,934 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 09:25:06,687 - INFO - Starting crawl...
|
||||
2026-02-19 09:25:06,743 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 09:25:06,744 - INFO - Filtering reviews published after: 2026-02-18 02:25:06.743971+00:00 UTC
|
||||
2026-02-19 09:25:06,744 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 09:25:06,860 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:06,860 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 09:25:06,960 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:06,960 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 09:25:07,079 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,079 - INFO - Crawling Outlet: 1932
|
||||
2026-02-19 09:25:07,188 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,188 - INFO - Crawling Outlet: 1928
|
||||
2026-02-19 09:25:07,291 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,291 - INFO - Crawling Outlet: 1934
|
||||
2026-02-19 09:25:07,394 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,395 - INFO - Crawling Outlet: 2617
|
||||
2026-02-19 09:25:07,492 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,492 - INFO - Crawling Outlet: 1925
|
||||
2026-02-19 09:25:07,609 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,609 - INFO - Crawling Outlet: 1935
|
||||
2026-02-19 09:25:07,713 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,713 - INFO - Crawling Outlet: 1938
|
||||
2026-02-19 09:25:07,820 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,821 - INFO - Crawling Outlet: 1933
|
||||
2026-02-19 09:25:07,926 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:07,926 - INFO - Crawling Outlet: 2175
|
||||
2026-02-19 09:25:08,029 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,030 - INFO - Crawling Outlet: 1931
|
||||
2026-02-19 09:25:08,150 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,150 - INFO - Crawling Outlet: 1936
|
||||
2026-02-19 09:25:08,479 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,479 - INFO - Crawling Outlet: 1939
|
||||
2026-02-19 09:25:08,578 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,578 - INFO - Crawling Outlet: 2877
|
||||
2026-02-19 09:25:08,683 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,684 - INFO - Crawling Outlet: 1929
|
||||
2026-02-19 09:25:08,792 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,792 - INFO - Crawling Outlet: 1627
|
||||
2026-02-19 09:25:08,894 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:08,894 - INFO - Crawling Outlet: 1926
|
||||
2026-02-19 09:25:09,003 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,003 - INFO - Crawling Outlet: 2946
|
||||
2026-02-19 09:25:09,108 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,108 - INFO - Crawling Outlet: 1930
|
||||
2026-02-19 09:25:09,216 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,216 - INFO - Crawling Outlet: 2878
|
||||
2026-02-19 09:25:09,330 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,331 - INFO - Crawling Outlet: 1923
|
||||
2026-02-19 09:25:09,690 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,690 - INFO - Crawling Outlet: 2308
|
||||
2026-02-19 09:25:09,795 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:25:09,795 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
|
||||
2026-02-19 09:25:09,795 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 09:26:11,250 - INFO - Starting crawl...
|
||||
2026-02-19 09:26:11,298 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 09:26:11,298 - INFO - Filtering reviews published after: 2026-02-18 02:26:11.298947+00:00 UTC
|
||||
2026-02-19 09:26:11,299 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 09:26:11,405 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,405 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 09:26:11,506 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,507 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 09:26:11,614 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,614 - INFO - Crawling Outlet: 1932
|
||||
2026-02-19 09:26:11,725 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,725 - INFO - Crawling Outlet: 1928
|
||||
2026-02-19 09:26:11,823 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,823 - INFO - Crawling Outlet: 1934
|
||||
2026-02-19 09:26:11,921 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:11,921 - INFO - Crawling Outlet: 2617
|
||||
2026-02-19 09:26:12,023 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,023 - INFO - Crawling Outlet: 1925
|
||||
2026-02-19 09:26:12,124 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,124 - INFO - Crawling Outlet: 1935
|
||||
2026-02-19 09:26:12,228 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,229 - INFO - Crawling Outlet: 1938
|
||||
2026-02-19 09:26:12,331 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,331 - INFO - Crawling Outlet: 1933
|
||||
2026-02-19 09:26:12,441 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,442 - INFO - Crawling Outlet: 2175
|
||||
2026-02-19 09:26:12,545 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,545 - INFO - Crawling Outlet: 1931
|
||||
2026-02-19 09:26:12,647 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,647 - INFO - Crawling Outlet: 1936
|
||||
2026-02-19 09:26:12,751 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,751 - INFO - Crawling Outlet: 1939
|
||||
2026-02-19 09:26:12,855 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,855 - INFO - Crawling Outlet: 2877
|
||||
2026-02-19 09:26:12,955 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:12,956 - INFO - Crawling Outlet: 1929
|
||||
2026-02-19 09:26:13,056 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,057 - INFO - Crawling Outlet: 1627
|
||||
2026-02-19 09:26:13,157 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,157 - INFO - Crawling Outlet: 1926
|
||||
2026-02-19 09:26:13,262 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,262 - INFO - Crawling Outlet: 2946
|
||||
2026-02-19 09:26:13,363 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,364 - INFO - Crawling Outlet: 1930
|
||||
2026-02-19 09:26:13,470 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,470 - INFO - Crawling Outlet: 2878
|
||||
2026-02-19 09:26:13,574 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,574 - INFO - Crawling Outlet: 1923
|
||||
2026-02-19 09:26:13,678 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,678 - INFO - Crawling Outlet: 2308
|
||||
2026-02-19 09:26:13,778 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:26:13,778 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
|
||||
2026-02-19 09:26:13,779 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 09:28:06,443 - INFO - Starting crawl...
|
||||
2026-02-19 09:28:06,496 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 09:28:06,496 - INFO - Filtering reviews published after: 2026-01-20 02:28:06.496653+00:00 UTC
|
||||
2026-02-19 09:28:06,496 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 09:28:06,664 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:06,664 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 09:28:06,786 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:06,787 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 09:28:06,905 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:06,905 - INFO - Crawling Outlet: 1932
|
||||
2026-02-19 09:28:07,025 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:07,025 - INFO - Crawling Outlet: 1928
|
||||
2026-02-19 09:28:07,148 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:07,148 - INFO - Crawling Outlet: 1934
|
||||
2026-02-19 09:28:07,257 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:07,257 - INFO - Crawling Outlet: 2617
|
||||
2026-02-19 09:28:07,377 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:07,377 - INFO - Crawling Outlet: 1925
|
||||
2026-02-19 09:28:07,481 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:07,481 - INFO - Crawling Outlet: 1935
|
||||
2026-02-19 09:28:07,607 - INFO - Updates: 1 (Skipped 3 old reviews)
|
||||
2026-02-19 09:28:07,607 - INFO - Crawling Outlet: 1938
|
||||
2026-02-19 09:28:07,730 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:07,730 - INFO - Crawling Outlet: 1933
|
||||
2026-02-19 09:28:07,845 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:07,845 - INFO - Crawling Outlet: 2175
|
||||
2026-02-19 09:28:07,952 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:07,952 - INFO - Crawling Outlet: 1931
|
||||
2026-02-19 09:28:08,069 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:08,069 - INFO - Crawling Outlet: 1936
|
||||
2026-02-19 09:28:08,177 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,178 - INFO - Crawling Outlet: 1939
|
||||
2026-02-19 09:28:08,296 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:08,296 - INFO - Crawling Outlet: 2877
|
||||
2026-02-19 09:28:08,416 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,416 - INFO - Crawling Outlet: 1929
|
||||
2026-02-19 09:28:08,526 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,526 - INFO - Crawling Outlet: 1627
|
||||
2026-02-19 09:28:08,634 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,634 - INFO - Crawling Outlet: 1926
|
||||
2026-02-19 09:28:08,749 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,749 - INFO - Crawling Outlet: 2946
|
||||
2026-02-19 09:28:08,870 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:08,870 - INFO - Crawling Outlet: 1930
|
||||
2026-02-19 09:28:08,973 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:08,973 - INFO - Crawling Outlet: 2878
|
||||
2026-02-19 09:28:09,092 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:09,092 - INFO - Crawling Outlet: 1923
|
||||
2026-02-19 09:28:09,197 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:09,197 - INFO - Crawling Outlet: 2308
|
||||
2026-02-19 09:28:09,301 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:09,301 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
|
||||
2026-02-19 09:28:09,301 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 09:28:25,472 - INFO - Starting crawl...
|
||||
2026-02-19 09:28:25,522 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 09:28:25,522 - INFO - Filtering reviews published after: 2026-01-20 02:28:25.522739+00:00 UTC
|
||||
2026-02-19 09:28:25,522 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 09:28:25,644 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:25,644 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 09:28:25,754 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:25,754 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 09:28:25,860 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:25,861 - INFO - Crawling Outlet: 1932
|
||||
2026-02-19 09:28:25,978 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:25,978 - INFO - Crawling Outlet: 1928
|
||||
2026-02-19 09:28:26,086 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:26,086 - INFO - Crawling Outlet: 1934
|
||||
2026-02-19 09:28:26,189 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:26,189 - INFO - Crawling Outlet: 2617
|
||||
2026-02-19 09:28:26,294 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:26,295 - INFO - Crawling Outlet: 1925
|
||||
2026-02-19 09:28:26,402 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:26,402 - INFO - Crawling Outlet: 1935
|
||||
2026-02-19 09:28:26,514 - INFO - Updates: 1 (Skipped 3 old reviews)
|
||||
2026-02-19 09:28:26,514 - INFO - Crawling Outlet: 1938
|
||||
2026-02-19 09:28:26,624 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:26,624 - INFO - Crawling Outlet: 1933
|
||||
2026-02-19 09:28:26,746 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:26,746 - INFO - Crawling Outlet: 2175
|
||||
2026-02-19 09:28:26,847 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:26,847 - INFO - Crawling Outlet: 1931
|
||||
2026-02-19 09:28:26,959 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:26,959 - INFO - Crawling Outlet: 1936
|
||||
2026-02-19 09:28:27,059 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,059 - INFO - Crawling Outlet: 1939
|
||||
2026-02-19 09:28:27,169 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:27,169 - INFO - Crawling Outlet: 2877
|
||||
2026-02-19 09:28:27,269 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,269 - INFO - Crawling Outlet: 1929
|
||||
2026-02-19 09:28:27,373 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,374 - INFO - Crawling Outlet: 1627
|
||||
2026-02-19 09:28:27,474 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,474 - INFO - Crawling Outlet: 1926
|
||||
2026-02-19 09:28:27,576 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,576 - INFO - Crawling Outlet: 2946
|
||||
2026-02-19 09:28:27,690 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 09:28:27,690 - INFO - Crawling Outlet: 1930
|
||||
2026-02-19 09:28:27,791 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,791 - INFO - Crawling Outlet: 2878
|
||||
2026-02-19 09:28:27,890 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:27,890 - INFO - Crawling Outlet: 1923
|
||||
2026-02-19 09:28:28,036 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:28,036 - INFO - Crawling Outlet: 2308
|
||||
2026-02-19 09:28:28,158 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 09:28:28,158 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
|
||||
2026-02-19 09:28:28,159 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 10:01:37,450 - INFO - Starting crawl...
|
||||
2026-02-19 10:01:37,520 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 10:01:37,522 - INFO - Filtering reviews published after: 2026-01-20 03:01:37.520273+00:00 UTC
|
||||
2026-02-19 10:01:37,522 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 10:01:37,726 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:37,726 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 10:01:37,902 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:37,902 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 10:01:38,090 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:38,090 - INFO - Crawling Outlet: 1932
|
||||
2026-02-19 10:01:38,236 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:38,236 - INFO - Crawling Outlet: 1928
|
||||
2026-02-19 10:01:38,382 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:38,382 - INFO - Crawling Outlet: 1934
|
||||
2026-02-19 10:01:38,619 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:38,619 - INFO - Crawling Outlet: 2617
|
||||
2026-02-19 10:01:38,784 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:38,784 - INFO - Crawling Outlet: 1925
|
||||
2026-02-19 10:01:38,934 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:38,934 - INFO - Crawling Outlet: 1935
|
||||
2026-02-19 10:01:39,090 - INFO - Updates: 1 (Skipped 3 old reviews)
|
||||
2026-02-19 10:01:39,090 - INFO - Crawling Outlet: 1938
|
||||
2026-02-19 10:01:39,236 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:39,236 - INFO - Crawling Outlet: 1933
|
||||
2026-02-19 10:01:39,378 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:39,378 - INFO - Crawling Outlet: 2175
|
||||
2026-02-19 10:01:39,531 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:39,531 - INFO - Crawling Outlet: 1931
|
||||
2026-02-19 10:01:39,687 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:39,687 - INFO - Crawling Outlet: 1936
|
||||
2026-02-19 10:01:39,841 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:39,841 - INFO - Crawling Outlet: 1939
|
||||
2026-02-19 10:01:39,991 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:39,991 - INFO - Crawling Outlet: 2877
|
||||
2026-02-19 10:01:40,132 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:40,132 - INFO - Crawling Outlet: 1929
|
||||
2026-02-19 10:01:40,296 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:40,297 - INFO - Crawling Outlet: 1627
|
||||
2026-02-19 10:01:40,425 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:40,425 - INFO - Crawling Outlet: 1926
|
||||
2026-02-19 10:01:40,558 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:40,558 - INFO - Crawling Outlet: 2946
|
||||
2026-02-19 10:01:40,757 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 10:01:40,757 - INFO - Crawling Outlet: 1930
|
||||
2026-02-19 10:01:40,888 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:40,888 - INFO - Crawling Outlet: 2878
|
||||
2026-02-19 10:01:41,027 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:41,027 - INFO - Crawling Outlet: 1923
|
||||
2026-02-19 10:01:41,155 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:41,155 - INFO - Crawling Outlet: 2308
|
||||
2026-02-19 10:01:41,298 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 10:01:41,298 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
|
||||
2026-02-19 10:01:41,299 - INFO - Crawler is running... Press Ctrl+C to stop.
|
||||
2026-02-19 10:24:14,959 - INFO - Starting crawl...
|
||||
2026-02-19 10:24:15,157 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 10:24:15,158 - INFO - Filtering reviews published after: 2026-02-18 03:24:15.158438 UTC
|
||||
2026-02-19 10:24:20,212 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
|
||||
2026-02-19 10:28:28,182 - INFO - Starting crawl...
|
||||
2026-02-19 10:45:37,099 - ERROR - Database connection failed.
|
||||
2026-02-19 11:24:20,238 - INFO - Starting crawl...
|
||||
2026-02-19 11:24:20,358 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 11:24:20,359 - INFO - Filtering reviews published after: 2026-02-18 04:24:20.359416 UTC
|
||||
2026-02-19 11:24:24,025 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
|
||||
2026-02-19 11:45:37,148 - INFO - Starting crawl...
|
||||
2026-02-19 11:45:37,245 - INFO - Found 24 outlets to crawl.
|
||||
2026-02-19 11:45:37,246 - INFO - Filtering reviews published after: 2026-01-20 04:45:37.245685+00:00 UTC
|
||||
2026-02-19 11:45:37,246 - INFO - Crawling Outlet: 1927
|
||||
2026-02-19 11:45:42,171 - INFO - Updates: 1 (Skipped 4 old reviews)
|
||||
2026-02-19 11:45:42,172 - INFO - Crawling Outlet: 1922
|
||||
2026-02-19 11:45:42,326 - INFO - Updates: -1 (Skipped 5 old reviews)
|
||||
2026-02-19 11:45:42,326 - INFO - Crawling Outlet: 1924
|
||||
2026-02-19 11:45:42,490 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:42,490 - INFO - Crawling Outlet: 1932
2026-02-19 11:45:42,641 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:42,642 - INFO - Crawling Outlet: 1928
2026-02-19 11:45:42,790 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:42,790 - INFO - Crawling Outlet: 1934
2026-02-19 11:45:42,931 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:42,932 - INFO - Crawling Outlet: 2617
2026-02-19 11:45:43,073 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:43,073 - INFO - Crawling Outlet: 1925
2026-02-19 11:45:43,248 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:43,248 - INFO - Crawling Outlet: 1935
2026-02-19 11:45:43,411 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 11:45:43,411 - INFO - Crawling Outlet: 1938
2026-02-19 11:45:43,565 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:43,565 - INFO - Crawling Outlet: 1933
2026-02-19 11:45:43,713 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:43,713 - INFO - Crawling Outlet: 2175
2026-02-19 11:45:43,851 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:43,851 - INFO - Crawling Outlet: 1931
2026-02-19 11:45:43,993 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:43,993 - INFO - Crawling Outlet: 1936
2026-02-19 11:45:44,151 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:44,151 - INFO - Crawling Outlet: 1939
2026-02-19 11:45:44,289 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:44,289 - INFO - Crawling Outlet: 2877
2026-02-19 11:45:44,427 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:44,427 - INFO - Crawling Outlet: 1929
2026-02-19 11:45:44,569 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:44,569 - INFO - Crawling Outlet: 1627
2026-02-19 11:45:44,710 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:44,710 - INFO - Crawling Outlet: 1926
2026-02-19 11:45:44,843 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:44,843 - INFO - Crawling Outlet: 2946
2026-02-19 11:45:45,002 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 11:45:45,002 - INFO - Crawling Outlet: 1930
2026-02-19 11:45:45,136 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:45,137 - INFO - Crawling Outlet: 2878
2026-02-19 11:45:45,264 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:45,264 - INFO - Crawling Outlet: 1923
2026-02-19 11:45:45,400 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:45,400 - INFO - Crawling Outlet: 2308
2026-02-19 11:45:45,533 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 11:45:45,533 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 12:24:24,049 - INFO - Starting crawl...
2026-02-19 12:24:24,136 - INFO - Found 24 outlets to crawl.
2026-02-19 12:24:24,137 - INFO - Filtering reviews published after: 2026-02-18 05:24:24.137540 UTC
2026-02-19 12:24:27,688 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
2026-02-19 12:45:45,554 - INFO - Starting crawl...
2026-02-19 12:45:45,641 - INFO - Found 24 outlets to crawl.
2026-02-19 12:45:45,642 - INFO - Filtering reviews published after: 2026-01-20 05:45:45.641787+00:00 UTC
2026-02-19 12:45:45,642 - INFO - Crawling Outlet: 1927
2026-02-19 12:45:45,899 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:45,900 - INFO - Crawling Outlet: 1922
2026-02-19 12:45:46,045 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:46,046 - INFO - Crawling Outlet: 1924
2026-02-19 12:45:46,202 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:46,202 - INFO - Crawling Outlet: 1932
2026-02-19 12:45:46,354 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:46,354 - INFO - Crawling Outlet: 1928
2026-02-19 12:45:46,498 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:46,498 - INFO - Crawling Outlet: 1934
2026-02-19 12:45:46,629 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:46,629 - INFO - Crawling Outlet: 2617
2026-02-19 12:45:46,779 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:46,779 - INFO - Crawling Outlet: 1925
2026-02-19 12:45:46,922 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:46,923 - INFO - Crawling Outlet: 1935
2026-02-19 12:45:47,083 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 12:45:47,083 - INFO - Crawling Outlet: 1938
2026-02-19 12:45:47,233 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:47,233 - INFO - Crawling Outlet: 1933
2026-02-19 12:45:47,390 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:47,390 - INFO - Crawling Outlet: 2175
2026-02-19 12:45:47,556 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:47,556 - INFO - Crawling Outlet: 1931
2026-02-19 12:45:47,700 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:47,700 - INFO - Crawling Outlet: 1936
2026-02-19 12:45:47,892 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:47,892 - INFO - Crawling Outlet: 1939
2026-02-19 12:45:48,093 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:48,093 - INFO - Crawling Outlet: 2877
2026-02-19 12:45:48,228 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:48,228 - INFO - Crawling Outlet: 1929
2026-02-19 12:45:48,374 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:48,374 - INFO - Crawling Outlet: 1627
2026-02-19 12:45:48,519 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:48,519 - INFO - Crawling Outlet: 1926
2026-02-19 12:45:48,650 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:48,650 - INFO - Crawling Outlet: 2946
2026-02-19 12:45:48,826 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 12:45:48,826 - INFO - Crawling Outlet: 1930
2026-02-19 12:45:48,956 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:48,956 - INFO - Crawling Outlet: 2878
2026-02-19 12:45:49,093 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:49,093 - INFO - Crawling Outlet: 1923
2026-02-19 12:45:49,226 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:49,226 - INFO - Crawling Outlet: 2308
2026-02-19 12:45:49,378 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 12:45:49,380 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 13:24:27,720 - INFO - Starting crawl...
2026-02-19 13:24:27,799 - INFO - Found 24 outlets to crawl.
2026-02-19 13:24:27,800 - INFO - Filtering reviews published after: 2026-02-18 06:24:27.800409 UTC
2026-02-19 13:24:31,399 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
2026-02-19 13:45:49,443 - INFO - Starting crawl...
2026-02-19 13:45:49,573 - INFO - Found 24 outlets to crawl.
2026-02-19 13:45:49,575 - INFO - Filtering reviews published after: 2026-01-20 06:45:49.574094+00:00 UTC
2026-02-19 13:45:49,575 - INFO - Crawling Outlet: 1927
2026-02-19 13:45:49,884 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:49,884 - INFO - Crawling Outlet: 1922
2026-02-19 13:45:50,045 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:50,045 - INFO - Crawling Outlet: 1924
2026-02-19 13:45:50,344 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:50,344 - INFO - Crawling Outlet: 1932
2026-02-19 13:45:50,498 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:50,498 - INFO - Crawling Outlet: 1928
2026-02-19 13:45:50,670 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:50,670 - INFO - Crawling Outlet: 1934
2026-02-19 13:45:50,890 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:50,890 - INFO - Crawling Outlet: 2617
2026-02-19 13:45:51,034 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:51,034 - INFO - Crawling Outlet: 1925
2026-02-19 13:45:51,169 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:51,169 - INFO - Crawling Outlet: 1935
2026-02-19 13:45:51,339 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 13:45:51,339 - INFO - Crawling Outlet: 1938
2026-02-19 13:45:51,485 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:51,485 - INFO - Crawling Outlet: 1933
2026-02-19 13:45:51,640 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:51,640 - INFO - Crawling Outlet: 2175
2026-02-19 13:45:51,775 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:51,775 - INFO - Crawling Outlet: 1931
2026-02-19 13:45:52,105 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:52,105 - INFO - Crawling Outlet: 1936
2026-02-19 13:45:52,302 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:52,302 - INFO - Crawling Outlet: 1939
2026-02-19 13:45:52,460 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:52,460 - INFO - Crawling Outlet: 2877
2026-02-19 13:45:52,607 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:52,607 - INFO - Crawling Outlet: 1929
2026-02-19 13:45:52,749 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:52,749 - INFO - Crawling Outlet: 1627
2026-02-19 13:45:52,907 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:52,907 - INFO - Crawling Outlet: 1926
2026-02-19 13:45:53,047 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:53,048 - INFO - Crawling Outlet: 2946
2026-02-19 13:45:53,191 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 13:45:53,191 - INFO - Crawling Outlet: 1930
2026-02-19 13:45:53,333 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:53,334 - INFO - Crawling Outlet: 2878
2026-02-19 13:45:53,472 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:53,472 - INFO - Crawling Outlet: 1923
2026-02-19 13:45:53,611 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:53,611 - INFO - Crawling Outlet: 2308
2026-02-19 13:45:53,757 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 13:45:53,758 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 14:24:31,424 - INFO - Starting crawl...
2026-02-19 14:24:31,568 - INFO - Found 24 outlets to crawl.
2026-02-19 14:24:31,569 - INFO - Filtering reviews published after: 2026-02-18 07:24:31.569105 UTC
2026-02-19 14:24:35,373 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
2026-02-19 14:32:13,522 - INFO - Starting crawl...
2026-02-19 14:32:13,556 - INFO - Found 24 outlets to crawl.
2026-02-19 14:32:13,556 - INFO - Filtering reviews published after: 2026-01-20 07:32:13.556406+00:00 UTC
2026-02-19 14:32:13,556 - INFO - Crawling Outlet: 1927
2026-02-19 14:32:13,764 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:13,764 - INFO - Crawling Outlet: 1922
2026-02-19 14:32:13,909 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:13,909 - INFO - Crawling Outlet: 1924
2026-02-19 14:32:14,056 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:14,057 - INFO - Crawling Outlet: 1932
2026-02-19 14:32:14,204 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:14,204 - INFO - Crawling Outlet: 1928
2026-02-19 14:32:14,381 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:14,381 - INFO - Crawling Outlet: 1934
2026-02-19 14:32:14,517 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:14,517 - INFO - Crawling Outlet: 2617
2026-02-19 14:32:14,659 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:14,659 - INFO - Crawling Outlet: 1925
2026-02-19 14:32:14,790 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:14,791 - INFO - Crawling Outlet: 1935
2026-02-19 14:32:14,943 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 14:32:14,943 - INFO - Crawling Outlet: 1938
2026-02-19 14:32:15,088 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:15,088 - INFO - Crawling Outlet: 1933
2026-02-19 14:32:15,232 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:15,232 - INFO - Crawling Outlet: 2175
2026-02-19 14:32:15,366 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:15,366 - INFO - Crawling Outlet: 1931
2026-02-19 14:32:15,512 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:15,512 - INFO - Crawling Outlet: 1936
2026-02-19 14:32:15,665 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:15,665 - INFO - Crawling Outlet: 1939
2026-02-19 14:32:15,815 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:15,815 - INFO - Crawling Outlet: 2877
2026-02-19 14:32:15,960 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:15,961 - INFO - Crawling Outlet: 1929
2026-02-19 14:32:16,102 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,102 - INFO - Crawling Outlet: 1627
2026-02-19 14:32:16,242 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,242 - INFO - Crawling Outlet: 1926
2026-02-19 14:32:16,373 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,373 - INFO - Crawling Outlet: 2946
2026-02-19 14:32:16,533 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:16,533 - INFO - Crawling Outlet: 1930
2026-02-19 14:32:16,673 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,673 - INFO - Crawling Outlet: 2878
2026-02-19 14:32:16,812 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,812 - INFO - Crawling Outlet: 1923
2026-02-19 14:32:16,951 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:16,951 - INFO - Crawling Outlet: 2308
2026-02-19 14:32:17,080 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:17,080 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 14:32:17,080 - INFO - Crawler is running... Press Ctrl+C to stop.
2026-02-19 14:32:23,740 - INFO - Starting crawl...
2026-02-19 14:32:23,773 - INFO - Found 24 outlets to crawl.
2026-02-19 14:32:23,773 - INFO - Filtering reviews published after: 2026-01-20 07:32:23.773136+00:00 UTC
2026-02-19 14:32:23,773 - INFO - Crawling Outlet: 1927
2026-02-19 14:32:23,880 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:23,881 - INFO - Crawling Outlet: 1922
2026-02-19 14:32:23,978 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:23,979 - INFO - Crawling Outlet: 1924
2026-02-19 14:32:24,078 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:24,078 - INFO - Crawling Outlet: 1932
2026-02-19 14:32:24,181 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:24,181 - INFO - Crawling Outlet: 1928
2026-02-19 14:32:24,305 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:24,306 - INFO - Crawling Outlet: 1934
2026-02-19 14:32:24,405 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:24,405 - INFO - Crawling Outlet: 2617
2026-02-19 14:32:24,509 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:24,509 - INFO - Crawling Outlet: 1925
2026-02-19 14:32:24,610 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:24,610 - INFO - Crawling Outlet: 1935
2026-02-19 14:32:24,726 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 14:32:24,726 - INFO - Crawling Outlet: 1938
2026-02-19 14:32:24,843 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:24,843 - INFO - Crawling Outlet: 1933
2026-02-19 14:32:24,958 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:24,959 - INFO - Crawling Outlet: 2175
2026-02-19 14:32:25,060 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:25,060 - INFO - Crawling Outlet: 1931
2026-02-19 14:32:25,250 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:25,250 - INFO - Crawling Outlet: 1936
2026-02-19 14:32:25,480 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:25,480 - INFO - Crawling Outlet: 1939
2026-02-19 14:32:25,599 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:25,600 - INFO - Crawling Outlet: 2877
2026-02-19 14:32:25,698 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:25,698 - INFO - Crawling Outlet: 1929
2026-02-19 14:32:25,792 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:25,793 - INFO - Crawling Outlet: 1627
2026-02-19 14:32:25,912 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:25,912 - INFO - Crawling Outlet: 1926
2026-02-19 14:32:26,030 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:26,030 - INFO - Crawling Outlet: 2946
2026-02-19 14:32:26,162 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:32:26,164 - INFO - Crawling Outlet: 1930
2026-02-19 14:32:26,281 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:26,281 - INFO - Crawling Outlet: 2878
2026-02-19 14:32:26,393 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:26,394 - INFO - Crawling Outlet: 1923
2026-02-19 14:32:26,502 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:26,502 - INFO - Crawling Outlet: 2308
2026-02-19 14:32:26,605 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:32:26,605 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 14:32:26,605 - INFO - Crawler is running... Press Ctrl+C to stop.
2026-02-19 14:34:30,398 - INFO - Starting crawl...
2026-02-19 14:34:30,434 - INFO - Found 24 outlets to crawl.
2026-02-19 14:34:30,434 - INFO - Crawling Outlet: 1927
2026-02-19 14:34:30,584 - INFO - Updates: 1
2026-02-19 14:34:30,585 - INFO - Crawling Outlet: 1922
2026-02-19 14:34:30,742 - INFO - Updates: 1
2026-02-19 14:34:30,742 - INFO - Crawling Outlet: 1924
2026-02-19 14:34:30,898 - INFO - Updates: 1
2026-02-19 14:34:30,898 - INFO - Crawling Outlet: 1932
2026-02-19 14:34:31,042 - INFO - Updates: 1
2026-02-19 14:34:31,042 - INFO - Crawling Outlet: 1928
2026-02-19 14:34:31,210 - INFO - Updates: 1
2026-02-19 14:34:31,211 - INFO - Crawling Outlet: 1934
2026-02-19 14:34:31,374 - INFO - Updates: 1
2026-02-19 14:34:31,374 - INFO - Crawling Outlet: 2617
2026-02-19 14:34:31,528 - INFO - Updates: 1
2026-02-19 14:34:31,528 - INFO - Crawling Outlet: 1925
2026-02-19 14:34:31,684 - INFO - Updates: 1
2026-02-19 14:34:31,684 - INFO - Crawling Outlet: 1935
2026-02-19 14:34:31,832 - INFO - Updates: 1
2026-02-19 14:34:31,832 - INFO - Crawling Outlet: 1938
2026-02-19 14:34:31,992 - INFO - Updates: 1
2026-02-19 14:34:31,992 - INFO - Crawling Outlet: 1933
2026-02-19 14:34:32,141 - INFO - Updates: 1
2026-02-19 14:34:32,141 - INFO - Crawling Outlet: 2175
2026-02-19 14:34:32,292 - INFO - Updates: 1
2026-02-19 14:34:32,293 - INFO - Crawling Outlet: 1931
2026-02-19 14:34:32,442 - INFO - Updates: 1
2026-02-19 14:34:32,442 - INFO - Crawling Outlet: 1936
2026-02-19 14:34:32,601 - INFO - Updates: 1
2026-02-19 14:34:32,602 - INFO - Crawling Outlet: 1939
2026-02-19 14:34:32,748 - INFO - Updates: 1
2026-02-19 14:34:32,748 - INFO - Crawling Outlet: 2877
2026-02-19 14:34:32,904 - INFO - Updates: 1
2026-02-19 14:34:32,904 - INFO - Crawling Outlet: 1929
2026-02-19 14:34:33,077 - INFO - Updates: 1
2026-02-19 14:34:33,078 - INFO - Crawling Outlet: 1627
2026-02-19 14:34:33,238 - INFO - Updates: 1
2026-02-19 14:34:33,238 - INFO - Crawling Outlet: 1926
2026-02-19 14:34:33,386 - INFO - Updates: 1
2026-02-19 14:34:33,386 - INFO - Crawling Outlet: 2946
2026-02-19 14:34:33,541 - INFO - Updates: 1
2026-02-19 14:34:33,541 - INFO - Crawling Outlet: 1930
2026-02-19 14:34:33,690 - INFO - Updates: 1
2026-02-19 14:34:33,690 - INFO - Crawling Outlet: 2878
2026-02-19 14:34:33,834 - INFO - Updates: 1
2026-02-19 14:34:33,834 - INFO - Crawling Outlet: 1923
2026-02-19 14:34:34,000 - INFO - Updates: 1
2026-02-19 14:34:34,000 - INFO - Crawling Outlet: 2308
2026-02-19 14:34:34,153 - INFO - Updates: 1
2026-02-19 14:34:34,153 - INFO - Crawl finished. Total reviews upserted: 120
2026-02-19 14:34:34,154 - INFO - Crawler is running... Press Ctrl+C to stop.
2026-02-19 14:45:53,788 - INFO - Starting crawl...
2026-02-19 14:45:53,913 - INFO - Found 24 outlets to crawl.
2026-02-19 14:45:53,914 - INFO - Filtering reviews published after: 2026-01-20 07:45:53.913273+00:00 UTC
2026-02-19 14:45:53,914 - INFO - Crawling Outlet: 1927
2026-02-19 14:45:54,177 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:54,178 - INFO - Crawling Outlet: 1922
2026-02-19 14:45:54,329 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:54,329 - INFO - Crawling Outlet: 1924
2026-02-19 14:45:54,488 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:54,488 - INFO - Crawling Outlet: 1932
2026-02-19 14:45:54,639 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:54,640 - INFO - Crawling Outlet: 1928
2026-02-19 14:45:54,793 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:54,793 - INFO - Crawling Outlet: 1934
2026-02-19 14:45:54,948 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:54,948 - INFO - Crawling Outlet: 2617
2026-02-19 14:45:55,096 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:55,096 - INFO - Crawling Outlet: 1925
2026-02-19 14:45:55,237 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:55,237 - INFO - Crawling Outlet: 1935
2026-02-19 14:45:55,732 - INFO - Updates: 1 (Skipped 3 old reviews)
2026-02-19 14:45:55,732 - INFO - Crawling Outlet: 1938
2026-02-19 14:45:55,906 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:55,906 - INFO - Crawling Outlet: 1933
2026-02-19 14:45:56,060 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:56,060 - INFO - Crawling Outlet: 2175
2026-02-19 14:45:56,239 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:56,239 - INFO - Crawling Outlet: 1931
2026-02-19 14:45:56,381 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:56,382 - INFO - Crawling Outlet: 1936
2026-02-19 14:45:56,584 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:56,584 - INFO - Crawling Outlet: 1939
2026-02-19 14:45:56,730 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:56,730 - INFO - Crawling Outlet: 2877
2026-02-19 14:45:56,865 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:56,865 - INFO - Crawling Outlet: 1929
2026-02-19 14:45:57,042 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:57,042 - INFO - Crawling Outlet: 1627
2026-02-19 14:45:57,238 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:57,238 - INFO - Crawling Outlet: 1926
2026-02-19 14:45:57,375 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:57,375 - INFO - Crawling Outlet: 2946
2026-02-19 14:45:57,535 - INFO - Updates: 1 (Skipped 4 old reviews)
2026-02-19 14:45:57,535 - INFO - Crawling Outlet: 1930
2026-02-19 14:45:57,681 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:57,681 - INFO - Crawling Outlet: 2878
2026-02-19 14:45:57,807 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:57,807 - INFO - Crawling Outlet: 1923
2026-02-19 14:45:58,057 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:58,057 - INFO - Crawling Outlet: 2308
2026-02-19 14:45:58,192 - INFO - Updates: -1 (Skipped 5 old reviews)
2026-02-19 14:45:58,194 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 10
2026-02-19 15:24:35,404 - INFO - Starting crawl...
2026-02-19 15:24:35,530 - INFO - Found 24 outlets to crawl.
2026-02-19 15:24:35,531 - INFO - Filtering reviews published after: 2026-02-18 08:24:35.531455 UTC
2026-02-19 15:24:39,199 - INFO - Crawl finished. Total reviews upserted (new/updated in last 24h): 0
2026-02-25 16:37:45,742 - INFO - Starting crawler process (Google Business Profile API)...
2026-02-25 16:38:07,810 - INFO - Starting crawler process (Google Business Profile API)...
2026-02-25 16:38:08,464 - INFO - Found 4 outlets with Google Business ID.
2026-02-25 16:38:08,465 - INFO - Filtering reviews published after: 2025-11-27 09:38:08.465067+00:00 UTC
2026-02-25 16:38:08,465 - INFO - Crawling Outlet: 1622 - Barata
2026-02-25 16:38:08,864 - INFO - Updates: 0 (Skipped 0 older than 3 months)
2026-02-25 16:38:08,865 - INFO - Crawling Outlet: 1923 - Pepelegi
2026-02-25 16:38:09,720 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 16:38:09,721 - INFO - Crawling Outlet: 1930 - Merr
2026-02-25 16:38:10,456 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 16:38:10,457 - INFO - Crawling Outlet: 1924 - Ponti
2026-02-25 16:38:11,258 - INFO - Updates: 34 (Skipped 16 older than 3 months)
2026-02-25 16:38:11,259 - INFO - Crawl finished. Total reviews upserted: 131
2026-02-25 17:03:01,445 - INFO - Starting crawler process (Google Business Profile API)...
2026-02-25 17:03:02,785 - INFO - Found 19 outlets with Google Business ID.
2026-02-25 17:03:02,786 - INFO - Filtering reviews published after: 2025-11-27 10:03:02.786184+00:00 UTC
2026-02-25 17:03:02,786 - INFO - Crawling Outlet: 1622 - Barata
2026-02-25 17:03:06,537 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:03:06,537 - INFO - Crawling Outlet: 1932 - Royal Plaza
2026-02-25 17:03:15,636 - INFO - Updates: 295 (Skipped 5 older than 3 months)
2026-02-25 17:03:15,637 - INFO - Crawling Outlet: 1925 - Tropodo
2026-02-25 17:03:16,504 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:03:16,504 - INFO - Crawling Outlet: 1928 - Kusuma Bangsa
2026-02-25 17:03:17,343 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:03:17,343 - INFO - Crawling Outlet: 1934 - Sepanjang
2026-02-25 17:03:18,292 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:03:18,293 - INFO - Crawling Outlet: 1923 - Pepelegi
2026-02-25 17:03:19,067 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:03:19,068 - INFO - Crawling Outlet: 1938 - Rungkut
2026-02-25 17:03:19,777 - INFO - Updates: 17 (Skipped 33 older than 3 months)
2026-02-25 17:03:19,777 - INFO - Crawling Outlet: 1935 - Wiyung
2026-02-25 17:03:20,586 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:03:20,586 - INFO - Crawling Outlet: 1929 - Imam Bonjol
2026-02-25 17:03:21,410 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:03:21,410 - INFO - Crawling Outlet: 1931 - Rest Area Sidoarjo
2026-02-25 17:03:22,139 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:03:22,140 - INFO - Crawling Outlet: 1939 - Kediri
2026-02-25 17:03:22,928 - INFO - Updates: 13 (Skipped 37 older than 3 months)
2026-02-25 17:03:22,929 - INFO - Crawling Outlet: 1927 - Gayungsari
2026-02-25 17:03:23,681 - INFO - Updates: 22 (Skipped 28 older than 3 months)
2026-02-25 17:03:23,682 - INFO - Crawling Outlet: 1627 - BG Junction
2026-02-25 17:03:24,360 - INFO - Updates: 4 (Skipped 46 older than 3 months)
2026-02-25 17:03:24,360 - INFO - Crawling Outlet: 2946 - Puri Surya Jaya
2026-02-25 17:03:27,562 - INFO - Updates: 133 (Skipped 17 older than 3 months)
2026-02-25 17:03:27,562 - INFO - Crawling Outlet: 1926 - Darmo
2026-02-25 17:03:28,347 - INFO - Updates: 24 (Skipped 26 older than 3 months)
2026-02-25 17:03:28,348 - INFO - Crawling Outlet: 1936 - Wahidin
2026-02-25 17:03:29,071 - INFO - Updates: 6 (Skipped 44 older than 3 months)
2026-02-25 17:03:29,071 - INFO - Crawling Outlet: 1922 - Plaza Surabaya Delta
2026-02-25 17:03:29,784 - INFO - Updates: 8 (Skipped 42 older than 3 months)
2026-02-25 17:03:29,784 - INFO - Crawling Outlet: 1930 - Merr
2026-02-25 17:03:30,534 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:03:30,534 - INFO - Crawling Outlet: 1924 - Ponti
2026-02-25 17:03:32,104 - INFO - Updates: 34 (Skipped 16 older than 3 months)
2026-02-25 17:03:32,105 - INFO - Crawl finished. Total reviews upserted: 933
2026-02-25 17:04:03,455 - INFO - Starting crawler process (Google Business Profile API)...
2026-02-25 17:04:04,113 - INFO - Found 19 outlets with Google Business ID.
2026-02-25 17:04:04,113 - INFO - Filtering reviews published after: 2025-11-27 10:04:04.113425+00:00 UTC
2026-02-25 17:04:04,113 - INFO - Crawling Outlet: 1622 - Barata
2026-02-25 17:04:04,846 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:04:04,847 - INFO - Crawling Outlet: 1932 - Royal Plaza
2026-02-25 17:04:09,499 - INFO - Updates: 295 (Skipped 5 older than 3 months)
2026-02-25 17:04:09,499 - INFO - Crawling Outlet: 1925 - Tropodo
2026-02-25 17:04:10,236 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:04:10,237 - INFO - Crawling Outlet: 1928 - Kusuma Bangsa
2026-02-25 17:04:11,045 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:04:11,045 - INFO - Crawling Outlet: 1934 - Sepanjang
2026-02-25 17:04:11,855 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:04:11,855 - INFO - Crawling Outlet: 1923 - Pepelegi
2026-02-25 17:04:12,630 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:04:12,631 - INFO - Crawling Outlet: 1938 - Rungkut
2026-02-25 17:04:14,151 - INFO - Updates: 17 (Skipped 33 older than 3 months)
2026-02-25 17:04:14,151 - INFO - Crawling Outlet: 1935 - Wiyung
2026-02-25 17:04:14,905 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:04:14,905 - INFO - Crawling Outlet: 1929 - Imam Bonjol
2026-02-25 17:04:15,681 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:04:15,681 - INFO - Crawling Outlet: 1931 - Rest Area Sidoarjo
2026-02-25 17:04:16,373 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:04:16,373 - INFO - Crawling Outlet: 1939 - Kediri
2026-02-25 17:04:17,044 - INFO - Updates: 13 (Skipped 37 older than 3 months)
2026-02-25 17:04:17,044 - INFO - Crawling Outlet: 1927 - Gayungsari
2026-02-25 17:04:17,757 - INFO - Updates: 22 (Skipped 28 older than 3 months)
2026-02-25 17:04:17,757 - INFO - Crawling Outlet: 1627 - BG Junction
2026-02-25 17:04:18,461 - INFO - Updates: 4 (Skipped 46 older than 3 months)
2026-02-25 17:04:18,462 - INFO - Crawling Outlet: 2946 - Puri Surya Jaya
2026-02-25 17:04:20,759 - INFO - Updates: 133 (Skipped 17 older than 3 months)
2026-02-25 17:04:20,760 - INFO - Crawling Outlet: 1926 - Darmo
2026-02-25 17:04:21,535 - INFO - Updates: 24 (Skipped 26 older than 3 months)
2026-02-25 17:04:21,535 - INFO - Crawling Outlet: 1936 - Wahidin
2026-02-25 17:04:22,217 - INFO - Updates: 6 (Skipped 44 older than 3 months)
2026-02-25 17:04:22,217 - INFO - Crawling Outlet: 1922 - Plaza Surabaya Delta
2026-02-25 17:04:22,886 - INFO - Updates: 8 (Skipped 42 older than 3 months)
2026-02-25 17:04:22,886 - INFO - Crawling Outlet: 1930 - Merr
2026-02-25 17:04:23,617 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:04:23,617 - INFO - Crawling Outlet: 1924 - Ponti
2026-02-25 17:04:24,407 - INFO - Updates: 34 (Skipped 16 older than 3 months)
2026-02-25 17:04:24,407 - INFO - Crawl finished. Total reviews upserted: 933
2026-02-25 17:07:56,973 - INFO - Starting crawler process (Google Business Profile API)...
2026-02-25 17:07:57,435 - INFO - Found 19 outlets with Google Business ID.
2026-02-25 17:07:57,435 - INFO - Filtering reviews published after: 2025-11-27 10:07:57.435234+00:00 UTC
2026-02-25 17:07:57,435 - INFO - Crawling Outlet: 1622 - Barata
2026-02-25 17:07:58,154 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:07:58,154 - INFO - Crawling Outlet: 1932 - Royal Plaza
2026-02-25 17:08:02,838 - INFO - Updates: 295 (Skipped 5 older than 3 months)
2026-02-25 17:08:02,838 - INFO - Crawling Outlet: 1925 - Tropodo
2026-02-25 17:08:03,595 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:08:03,595 - INFO - Crawling Outlet: 1928 - Kusuma Bangsa
2026-02-25 17:08:04,385 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:08:04,386 - INFO - Crawling Outlet: 1934 - Sepanjang
2026-02-25 17:08:05,207 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:08:05,207 - INFO - Crawling Outlet: 1923 - Pepelegi
2026-02-25 17:08:05,977 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:08:05,978 - INFO - Crawling Outlet: 1938 - Rungkut
2026-02-25 17:08:06,707 - INFO - Updates: 17 (Skipped 33 older than 3 months)
2026-02-25 17:08:06,707 - INFO - Crawling Outlet: 1935 - Wiyung
2026-02-25 17:08:07,435 - INFO - Updates: 48 (Skipped 2 older than 3 months)
2026-02-25 17:08:07,435 - INFO - Crawling Outlet: 1929 - Imam Bonjol
2026-02-25 17:08:08,235 - INFO - Updates: 49 (Skipped 1 older than 3 months)
2026-02-25 17:08:08,235 - INFO - Crawling Outlet: 1931 - Rest Area Sidoarjo
2026-02-25 17:08:08,941 - INFO - Updates: 19 (Skipped 31 older than 3 months)
2026-02-25 17:08:08,942 - INFO - Crawling Outlet: 1939 - Kediri
2026-02-25 17:08:09,839 - INFO - Updates: 13 (Skipped 37 older than 3 months)
2026-02-25 17:08:09,840 - INFO - Crawling Outlet: 1927 - Gayungsari
2026-02-25 17:08:10,554 - INFO - Updates: 22 (Skipped 28 older than 3 months)
2026-02-25 17:08:10,555 - INFO - Crawling Outlet: 1627 - BG Junction
2026-02-25 17:08:13,207 - INFO - Updates: 4 (Skipped 46 older than 3 months)
2026-02-25 17:08:13,208 - INFO - Crawling Outlet: 2946 - Puri Surya Jaya
|
||||
2026-02-25 17:08:15,450 - INFO - Updates: 133 (Skipped 17 older than 3 months)
|
||||
2026-02-25 17:08:15,450 - INFO - Crawling Outlet: 1926 - Darmo
|
||||
2026-02-25 17:08:16,957 - INFO - Updates: 24 (Skipped 26 older than 3 months)
|
||||
2026-02-25 17:08:16,957 - INFO - Crawling Outlet: 1936 - Wahidin
|
||||
2026-02-25 17:08:17,690 - INFO - Updates: 6 (Skipped 44 older than 3 months)
|
||||
2026-02-25 17:08:17,690 - INFO - Crawling Outlet: 1922 - Plaza Surabaya Delta
|
||||
2026-02-25 17:08:18,435 - INFO - Updates: 8 (Skipped 42 older than 3 months)
|
||||
2026-02-25 17:08:18,435 - INFO - Crawling Outlet: 1930 - Merr
|
||||
2026-02-25 17:08:19,188 - INFO - Updates: 49 (Skipped 1 older than 3 months)
|
||||
2026-02-25 17:08:19,188 - INFO - Crawling Outlet: 1924 - Ponti
|
||||
2026-02-25 17:08:19,958 - INFO - Updates: 34 (Skipped 16 older than 3 months)
|
||||
2026-02-25 17:08:19,958 - INFO - Crawl finished. Total reviews upserted: 933
|
||||
2026-02-25 17:09:18,894 - INFO - Starting crawler process (Google Business Profile API)...
|
||||
2026-02-25 17:09:20,405 - INFO - Found 25 outlets with Google Business ID.
|
||||
2026-02-25 17:09:20,406 - INFO - Filtering reviews published after: 2025-11-27 10:09:20.406084+00:00 UTC
|
||||
2026-02-25 17:09:20,406 - INFO - Crawling Outlet: 1622 - Barata
|
||||
2026-02-25 17:09:21,133 - INFO - Updates: 19 (Skipped 31 older than 3 months)
|
||||
2026-02-25 17:09:21,133 - INFO - Crawling Outlet: 1932 - Royal Plaza
|
||||
2026-02-25 17:09:26,653 - INFO - Updates: 295 (Skipped 5 older than 3 months)
|
||||
2026-02-25 17:09:26,653 - INFO - Crawling Outlet: 1925 - Tropodo
|
||||
2026-02-25 17:09:27,424 - INFO - Updates: 48 (Skipped 2 older than 3 months)
|
||||
2026-02-25 17:09:27,424 - INFO - Crawling Outlet: 1928 - Kusuma Bangsa
|
||||
2026-02-25 17:09:28,207 - INFO - Updates: 48 (Skipped 2 older than 3 months)
|
||||
2026-02-25 17:09:28,208 - INFO - Crawling Outlet: 1934 - Sepanjang
|
||||
2026-02-25 17:09:28,977 - INFO - Updates: 48 (Skipped 2 older than 3 months)
|
||||
2026-02-25 17:09:28,977 - INFO - Crawling Outlet: 1923 - Pepelegi
|
||||
2026-02-25 17:09:29,761 - INFO - Updates: 49 (Skipped 1 older than 3 months)
|
||||
2026-02-25 17:09:29,761 - INFO - Crawling Outlet: 1938 - Rungkut
|
||||
2026-02-25 17:09:31,233 - INFO - Updates: 17 (Skipped 33 older than 3 months)
|
||||
2026-02-25 17:09:31,233 - INFO - Crawling Outlet: 1933 - Citraland
|
||||
2026-02-25 17:09:32,046 - INFO - Updates: 11 (Skipped 39 older than 3 months)
|
||||
2026-02-25 17:09:32,046 - INFO - Crawling Outlet: 1935 - Wiyung
|
||||
2026-02-25 17:09:33,597 - INFO - Updates: 48 (Skipped 2 older than 3 months)
|
||||
2026-02-25 17:09:33,597 - INFO - Crawling Outlet: 1929 - Imam Bonjol
|
||||
2026-02-25 17:09:34,324 - INFO - Updates: 49 (Skipped 1 older than 3 months)
|
||||
2026-02-25 17:09:34,325 - INFO - Crawling Outlet: 1931 - Rest Area Sidoarjo
|
||||
2026-02-25 17:09:35,006 - INFO - Updates: 19 (Skipped 31 older than 3 months)
|
||||
2026-02-25 17:09:35,006 - INFO - Crawling Outlet: 2175 - Pakuwon Trade Center
|
||||
2026-02-25 17:09:35,704 - INFO - Updates: 1 (Skipped 49 older than 3 months)
|
||||
2026-02-25 17:09:35,705 - INFO - Crawling Outlet: 2877 - AEON
|
||||
2026-02-25 17:09:36,413 - INFO - Updates: 5 (Skipped 44 older than 3 months)
|
||||
2026-02-25 17:09:36,413 - INFO - Crawling Outlet: 1939 - Kediri
|
||||
2026-02-25 17:09:38,388 - INFO - Updates: 13 (Skipped 37 older than 3 months)
|
||||
2026-02-25 17:09:38,389 - INFO - Crawling Outlet: 1927 - Gayungsari
|
||||
2026-02-25 17:09:39,111 - INFO - Updates: 22 (Skipped 28 older than 3 months)
|
||||
2026-02-25 17:09:39,112 - INFO - Crawling Outlet: 1627 - BG Junction
|
||||
2026-02-25 17:09:40,592 - INFO - Updates: 4 (Skipped 46 older than 3 months)
|
||||
2026-02-25 17:09:40,593 - INFO - Crawling Outlet: 2878 - Karawaci
|
||||
2026-02-25 17:09:41,239 - INFO - Updates: 1 (Skipped 49 older than 3 months)
|
||||
2026-02-25 17:09:41,240 - INFO - Crawling Outlet: 2946 - Puri Surya Jaya
|
||||
2026-02-25 17:09:43,484 - INFO - Updates: 133 (Skipped 17 older than 3 months)
|
||||
2026-02-25 17:09:43,485 - INFO - Crawling Outlet: 1926 - Darmo
|
||||
2026-02-25 17:09:44,203 - INFO - Updates: 24 (Skipped 26 older than 3 months)
|
||||
2026-02-25 17:09:44,204 - INFO - Crawling Outlet: 1936 - Wahidin
|
||||
2026-02-25 17:09:44,898 - INFO - Updates: 6 (Skipped 44 older than 3 months)
|
||||
2026-02-25 17:09:44,899 - INFO - Crawling Outlet: 2617 - Pakuwon City Mall
|
||||
2026-02-25 17:09:45,618 - INFO - Updates: 6 (Skipped 44 older than 3 months)
|
||||
2026-02-25 17:09:45,619 - INFO - Crawling Outlet: 1922 - Plaza Surabaya Delta
|
||||
2026-02-25 17:09:46,488 - INFO - Updates: 8 (Skipped 42 older than 3 months)
|
||||
2026-02-25 17:09:46,489 - INFO - Crawling Outlet: 2308 - Plaza Festival
|
||||
2026-02-25 17:09:47,231 - INFO - Updates: 14 (Skipped 36 older than 3 months)
|
||||
2026-02-25 17:09:47,231 - INFO - Crawling Outlet: 1930 - Merr
|
||||
2026-02-25 17:09:47,938 - INFO - Updates: 49 (Skipped 1 older than 3 months)
|
||||
2026-02-25 17:09:47,938 - INFO - Crawling Outlet: 1924 - Ponti
|
||||
2026-02-25 17:09:48,660 - INFO - Updates: 34 (Skipped 16 older than 3 months)
|
||||
2026-02-25 17:09:48,660 - INFO - Crawl finished. Total reviews upserted: 971
|
||||
212
crawler.py
Normal file
@@ -0,0 +1,212 @@
import os
import time
import requests
import json
import logging
from datetime import datetime, timedelta, timezone
from dotenv import load_dotenv
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from database import get_db_connection, create_table

# Setup basic logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s',
    handlers=[
        logging.FileHandler("crawler.log"),
        logging.StreamHandler()
    ]
)

def fetch_outlets(conn):
    """
    Fetch all outlets that have a google_business_id.
    """
    with conn.cursor() as cur:
        cur.execute("""
            SELECT google_business_id, popcorn_code, outlet_name
            FROM master_outlet
            WHERE google_business_id IS NOT NULL
              AND google_business_id != '';
        """)
        rows = cur.fetchall()
    return [{"google_business_id": str(row[0]).strip(), "outlet_code": str(row[1]), "outlet_name": str(row[2])} for row in rows]

def get_oauth_headers():
    if not os.path.exists('token.json'):
        raise FileNotFoundError("token.json not found. Run authorize.py first.")

    creds = Credentials.from_authorized_user_file('token.json')
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())
        # Save refreshed credentials
        with open('token.json', 'w') as token:
            token.write(creds.to_json())

    return {
        'Authorization': f'Bearer {creds.token}',
        'Content-Type': 'application/json',
        'Accept-Language': 'id'
    }

def get_account_name(headers):
    account_url = "https://mybusinessaccountmanagement.googleapis.com/v1/accounts"
    res = requests.get(account_url, headers=headers)
    res.raise_for_status()
    accounts = res.json().get('accounts', [])
    if not accounts:
        raise ValueError("No Google Business accounts found.")
    return accounts[0]['name']

def crawl_reviews():
    """
    Main crawling function using the Google Business Profile API.
    """
    conn = get_db_connection()
    if conn is None:
        return

    try:
        # Create table if it does not exist (schema from database.py should be up-to-date)
        create_table()

        headers = get_oauth_headers()
        account_name = get_account_name(headers)

        outlets = fetch_outlets(conn)
        logging.info(f"Found {len(outlets)} outlets with Google Business ID.")

        # Define cutoff time (3 months ~ 90 days ago)
        cutoff_time = datetime.now(timezone.utc) - timedelta(days=90)
        logging.info(f"Filtering reviews published after: {cutoff_time} UTC")

        total_upserted = 0

        for outlet in outlets:
            location_id = outlet["google_business_id"]
            outlet_code = outlet["outlet_code"]
            outlet_name = outlet["outlet_name"]

            logging.info(f"Crawling Outlet: {outlet_code} - {outlet_name}")

            # https://mybusiness.googleapis.com/v4/{account_name}/locations/{location_id}/reviews
            base_url = f"https://mybusiness.googleapis.com/v4/{account_name}/locations/{location_id}/reviews"

            page_token = None
            outlet_upserted = 0
            outlet_skipped = 0
            stop_pagination = False

            while not stop_pagination:
                params = {}
                if page_token:
                    params['pageToken'] = page_token

                response = None  # ensure the name is bound if the request itself fails
                try:
                    response = requests.get(base_url, headers=headers, params=params)
                    response.raise_for_status()
                    data = response.json()

                    reviews = data.get('reviews', [])
                    if not reviews:
                        break  # No more reviews

                    with conn.cursor() as cur:
                        for review in reviews:
                            # Convert createTime,
                            # e.g. "2023-10-25T14:48:00Z" (sometimes microseconds are present)
                            create_time_str = review.get("createTime")
                            publish_time = None
                            if create_time_str:
                                if "." in create_time_str:
                                    clean_time = create_time_str.split('.')[0]
                                else:
                                    clean_time = create_time_str.replace("Z", "")
                                publish_time = datetime.strptime(clean_time, "%Y-%m-%dT%H:%M:%S").replace(tzinfo=timezone.utc)

                            # Apply the 3-month filter
                            if publish_time and publish_time < cutoff_time:
                                outlet_skipped += 1
                                # The Business Profile API usually returns reviews newest first,
                                # so stop paginating after this page, but still finish processing it.
                                stop_pagination = True
                                continue

                            # Extract fields
                            review_id = review.get("reviewId")
                            comment = review.get("comment", "")

                            # Remove Google Translate prefix/suffix if present
                            if "(Original)" in comment:
                                parts = comment.split("(Original)")
                                if len(parts) > 1:
                                    comment = parts[-1].strip()
                            elif "(Diterjemahkan oleh Google)" in comment:
                                parts = comment.split("\n\n(Diterjemahkan oleh Google)")
                                if len(parts) > 1:
                                    comment = parts[0].strip()
                                else:
                                    parts = comment.split("(Diterjemahkan oleh Google)")
                                    comment = parts[0].strip()

                            author_name = review.get("reviewer", {}).get("displayName", "Unknown")
                            rating_str = review.get("starRating")  # e.g., "FIVE"

                            # Convert rating "FIVE" to 5
                            rating_map = {"ONE": 1, "TWO": 2, "THREE": 3, "FOUR": 4, "FIVE": 5}
                            rating = rating_map.get(rating_str, 0)

                            language = None  # The API does not return the language explicitly

                            # Upsert logic. place_id is not returned by this endpoint and was not
                            # selected from master_outlet, so insert NULL; review_id is the primary
                            # key and conflict target, place_id is just a nullable column.
                            cur.execute("""
                                INSERT INTO google_review (
                                    review_id, place_id, original_text, author_display_name, publish_time, rating, outlet_code, language
                                ) VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
                                ON CONFLICT (review_id) DO UPDATE SET
                                    original_text = EXCLUDED.original_text,
                                    author_display_name = EXCLUDED.author_display_name,
                                    rating = EXCLUDED.rating,
                                    language = EXCLUDED.language,
                                    publish_time = EXCLUDED.publish_time,
                                    updated_at = CURRENT_TIMESTAMP;
                            """, (review_id, None, comment, author_name, publish_time, rating, outlet_code, language))

                            outlet_upserted += 1
                            total_upserted += 1

                    conn.commit()

                    page_token = data.get('nextPageToken')
                    if not page_token:
                        break  # No more pages

                except Exception as e:
                    logging.error(f"  Error fetching reviews for {outlet_code}: {e}")
                    if response is not None and response.status_code != 200:
                        logging.error(f"  API Response Error: {response.text}")
                    conn.rollback()
                    break  # Stop pagination on error

            logging.info(f"  Updates: {outlet_upserted} (Skipped {outlet_skipped} older than 3 months)")

        logging.info(f"Crawl finished. Total reviews upserted: {total_upserted}")

    finally:
        conn.close()

def main():
    # Only run once, immediately
    logging.info("Starting crawler process (Google Business Profile API)...")
    crawl_reviews()

if __name__ == "__main__":
    main()
53
database.py
Normal file
@@ -0,0 +1,53 @@
import os
import psycopg2
from psycopg2 import sql
from dotenv import load_dotenv

load_dotenv()

def get_db_connection():
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        return conn
    except Exception as e:
        print(f"Error connecting to database: {e}")
        return None

def create_table():
    conn = get_db_connection()
    if not conn:
        return

    try:
        with conn.cursor() as cur:
            # Create table with the requested fields
            cur.execute("""
                CREATE TABLE IF NOT EXISTS google_review (
                    id SERIAL PRIMARY KEY,
                    review_id TEXT UNIQUE,
                    place_id TEXT,
                    original_text TEXT,
                    author_display_name TEXT,
                    publish_time TIMESTAMP,
                    rating INTEGER,
                    outlet_code VARCHAR(255),
                    language VARCHAR(10),
                    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                    CONSTRAINT fk_outlet_code FOREIGN KEY (outlet_code) REFERENCES master_outlet(popcorn_code) ON DELETE SET NULL
                );
            """)
        conn.commit()
        print("Table 'google_review' verified/created successfully.")
    except Exception as e:
        print(f"Error creating table: {e}")
    finally:
        conn.close()

if __name__ == "__main__":
    create_table()
54
fetch_reviews.py
Normal file
@@ -0,0 +1,54 @@
import os
import requests
import json
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

API_KEY = os.getenv("GOOGLE_API_KEY")
PLACE_ID = os.getenv("PLACE_ID")

def fetch_place_reviews(place_id, api_key):
    url = f"https://places.googleapis.com/v1/places/{place_id}"

    params = {
        "languageCode": "id"
    }

    headers = {
        "Content-Type": "application/json",
        "X-Goog-Api-Key": api_key,
        "X-Goog-FieldMask": "id,displayName,reviews"
    }

    try:
        response = requests.get(url, headers=headers, params=params)
        response.raise_for_status()
        data = response.json()

        if "reviews" in data:
            for review in data["reviews"]:
                lang = review.get("originalText", {}).get("languageCode")
                if lang and lang != "en":
                    print(f"Found non-English review! Language: {lang}")
                    print(json.dumps(review, indent=2))

        return data
    except requests.exceptions.HTTPError as http_err:
        print(f"HTTP error occurred: {http_err}")
    except Exception as err:
        print(f"An error occurred: {err}")
    return None

if __name__ == "__main__":
    print(f"Fetching reviews for Place ID: {PLACE_ID}...")
    data = fetch_place_reviews(PLACE_ID, API_KEY)

    if data:
        print("Successfully fetched data!")
        print("\n--- SAMPLE RESPONSE STARTS ---")
        print(json.dumps(data, indent=2))
        print("--- SAMPLE RESPONSE ENDS ---\n")
    else:
        print("Failed to fetch data.")
10
inspect_business_ids.py
Normal file
@@ -0,0 +1,10 @@
from database import get_db_connection

conn = get_db_connection()
with conn.cursor() as cur:
    cur.execute("SELECT outlet_name, popcorn_code, google_business_id FROM master_outlet WHERE google_business_id IS NOT NULL LIMIT 10;")
    rows = cur.fetchall()
    print("Outlets with Google Business ID:")
    for r in rows:
        print(r)
conn.close()
28
inspect_db.py
Normal file
@@ -0,0 +1,28 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def inspect_master_outlet():
    conn = None  # ensure the name is bound for the finally block if connect fails
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            # Check columns in master_outlet using cursor.description
            cur.execute("SELECT * FROM master_outlet LIMIT 0;")
            colnames = [desc[0] for desc in cur.description]
            print("Columns in master_outlet:", colnames)

    except Exception as e:
        print(f"Error inspecting table: {e}")
    finally:
        if conn:
            conn.close()

if __name__ == "__main__":
    inspect_master_outlet()
27
inspect_db_simple.py
Normal file
@@ -0,0 +1,27 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def check_columns():
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD"),
            connect_timeout=10
        )
        print("Connected.")
        with conn.cursor() as cur:
            cur.execute("SELECT * FROM master_outlet LIMIT 0;")
            colnames = [desc[0] for desc in cur.description]
            print("Columns:", colnames)
        conn.close()
    except Exception as e:
        print(f"Error: {e}")

if __name__ == "__main__":
    check_columns()
72
list_locations.py
Normal file
@@ -0,0 +1,72 @@
import os.path
import json
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = [
    'https://www.googleapis.com/auth/business.manage'
]

def get_creds():
    creds = None
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)
    return creds

def list_locations():
    creds = get_creds()
    if not creds:
        print("No token.json found. Please run authorize.py first.")
        return

    try:
        # 1. Get the Account ID.
        # The "mybusinessaccountmanagement" service is needed to get the account.
        service_account = build('mybusinessaccountmanagement', 'v1', credentials=creds)
        accounts_result = service_account.accounts().list().execute()
        accounts = accounts_result.get('accounts', [])

        if not accounts:
            print("No Google Business accounts found.")
            return

        # Use the first account (usually there is only one, or the user wants the main one).
        # You might want to list them if there are multiple.
        account = accounts[0]
        account_name = account['name']  # Format: accounts/{accountId}
        print(f"Using Account: {account['accountName']} ({account_name})")

        # 2. List Locations for that Account.
        # The "mybusinessbusinessinformation" service is used for locations.
        service_locations = build('mybusinessbusinessinformation', 'v1', credentials=creds)

        print("\nFetching locations...\n")
        print(f"{'Store Code':<15} | {'Location ID':<40} | {'Outlet Name'}")
        print("-" * 80)

        request = service_locations.accounts().locations().list(
            parent=account_name,
            readMask="name,title,storeCode"
        )

        while request is not None:
            response = request.execute()
            locations = response.get('locations', [])

            for loc in locations:
                # name is "locations/{locationId}"
                location_id = loc.get('name', '').split('/')[-1]
                title = loc.get('title', 'N/A')
                store_code = loc.get('storeCode', 'N/A')

                print(f"{store_code:<15} | {location_id:<40} | {title}")

            request = service_locations.accounts().locations().list_next(request, response)

    except Exception as e:
        print(f"An error occurred: {e}")

if __name__ == '__main__':
    list_locations()
55
migrate_db.py
Normal file
@@ -0,0 +1,55 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def migrate_db():
    conn = None  # ensure the name is bound for the finally block if connect fails
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            # Add outlet_code column
            cur.execute("""
                ALTER TABLE google_review
                ADD COLUMN IF NOT EXISTS outlet_code VARCHAR(255);
            """)
            print("Added 'outlet_code' column.")

            # Add foreign key constraint.
            # Note: master_outlet(popcorn_code) must be unique. The inspection showed it has a unique constraint.
            try:
                cur.execute("""
                    ALTER TABLE google_review
                    ADD CONSTRAINT fk_outlet_code
                    FOREIGN KEY (outlet_code)
                    REFERENCES master_outlet(popcorn_code)
                    ON DELETE SET NULL;
                """)
                print("Added foreign key constraint for 'outlet_code'.")
            except psycopg2.errors.DuplicateObject:
                print("Foreign key constraint already exists.")
                conn.rollback()

            # Add language column
            cur.execute("""
                ALTER TABLE google_review
                ADD COLUMN IF NOT EXISTS language VARCHAR(10);
            """)
            print("Added 'language' column.")

        conn.commit()
        print("Migration completed successfully.")

    except Exception as e:
        print(f"Error migrating database: {e}")
    finally:
        if conn:
            conn.close()

if __name__ == "__main__":
    migrate_db()
32
migrate_updated_at.py
Normal file
@@ -0,0 +1,32 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def migrate_updated_at():
    conn = None  # ensure the name is bound for the finally block if connect fails
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            # Add updated_at column
            cur.execute("""
                ALTER TABLE google_review
                ADD COLUMN IF NOT EXISTS updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP;
            """)
            print("Added 'updated_at' column.")
        conn.commit()
        print("Migration completed successfully.")

    except Exception as e:
        print(f"Error migrating database: {e}")
    finally:
        if conn:
            conn.close()

if __name__ == "__main__":
    migrate_updated_at()
7
requirements.txt
Normal file
@@ -0,0 +1,7 @@
requests
python-dotenv
psycopg2-binary
schedule
google-auth-oauthlib
google-auth-httplib2
google-api-python-client
29
revert_test_data.py
Normal file
@@ -0,0 +1,29 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

PLACE_ID = "ChIJl_j1m6v71y0RzcEBmBCj8tg"

def revert_test_data():
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            # Revert the change
            cur.execute("UPDATE master_outlet SET place_id = NULL WHERE place_id = %s;", (PLACE_ID,))
            conn.commit()
            print(f"Reverted place_id for {cur.rowcount} outlet(s).")

        conn.close()
    except Exception as e:
        print(f"Error reverting data: {e}")

if __name__ == "__main__":
    revert_test_data()
11
run_hourly.sh
Executable file
@@ -0,0 +1,11 @@
#!/bin/bash

# Activate the virtual environment
source venv/bin/activate

echo "Starting hourly crawler in background..."
nohup python schedule_crawler.py > /dev/null 2>&1 &

echo "Crawler is now running in the background."
echo "Check scheduler.log and crawler.log for output."
echo "To stop the crawler later, run: pkill -f schedule_crawler.py"
45
schedule_crawler.py
Normal file
@@ -0,0 +1,45 @@
import schedule
import time
import subprocess
import logging
from datetime import datetime

# Setup basic logging for the scheduler
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - SCHEDULER - %(message)s',
    handlers=[
        logging.FileHandler("scheduler.log"),
        logging.StreamHandler()
    ]
)

def job():
    logging.info("Starting hourly crawler execution...")
    try:
        # Run crawler.py as a separate process to ensure a clean memory state
        subprocess.run(["./venv/bin/python", "crawler.py"], check=True)
        logging.info("Crawler execution finished successfully.")
    except subprocess.CalledProcessError as e:
        logging.error(f"Crawler failed with exit code: {e.returncode}")
    except Exception as e:
        logging.error(f"Failed to run crawler: {e}")

if __name__ == "__main__":
    logging.info("--- Crawler Scheduler Started ---")

    # Run once immediately on startup
    job()

    # Schedule to run every 1 hour
    schedule.every(1).hours.do(job)

    logging.info("Scheduled to run every 1 hour. Waiting in background...")

    # Keep the script running
    try:
        while True:
            schedule.run_pending()
            time.sleep(60)  # Check every minute
    except KeyboardInterrupt:
        logging.info("Scheduler stopped by user.")
4
scheduler.log
Normal file
@@ -0,0 +1,4 @@
2026-02-25 17:04:03,177 - SCHEDULER - --- Crawler Scheduler Started ---
2026-02-25 17:04:03,177 - SCHEDULER - Starting hourly crawler execution...
2026-02-25 17:04:24,449 - SCHEDULER - Crawler execution finished successfully.
2026-02-25 17:04:24,449 - SCHEDULER - Scheduled to run every 1 hour. Waiting in background...
47
setup_test_data.py
Normal file
@@ -0,0 +1,47 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

PLACE_ID = "ChIJl_j1m6v71y0RzcEBmBCj8tg"

def setup_test_data():
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD")
        )
        with conn.cursor() as cur:
            # Check for Ngagel
            cur.execute("SELECT id, outlet_name, popcorn_code FROM master_outlet WHERE outlet_name ILIKE '%Ngagel%';")
            rows = cur.fetchall()

            target_id = None
            if rows:
                print(f"Found Ngagel outlet: {rows[0]}")
                target_id = rows[0][0]
            else:
                print("Ngagel outlet not found. Using the first available outlet.")
                cur.execute("SELECT id, outlet_name, popcorn_code FROM master_outlet LIMIT 1;")
                rows = cur.fetchall()
                if rows:
                    print(f"Using outlet: {rows[0]}")
                    target_id = rows[0][0]

            if target_id:
                cur.execute("UPDATE master_outlet SET place_id = %s WHERE id = %s;", (PLACE_ID, target_id))
                conn.commit()
                print(f"Updated outlet {target_id} with Place ID: {PLACE_ID}")
            else:
                print("No outlets found in master_outlet table.")

        conn.close()
    except Exception as e:
        print(f"Error setting up test data: {e}")

if __name__ == "__main__":
    setup_test_data()
49
test_all_api.py
Normal file
@@ -0,0 +1,49 @@
import os
import requests
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from database import get_db_connection

def test_fetch_reviews():
    creds = Credentials.from_authorized_user_file('token.json')
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())

    headers = {
        'Authorization': f'Bearer {creds.token}'
    }

    account_url = "https://mybusinessaccountmanagement.googleapis.com/v1/accounts"
    res = requests.get(account_url, headers=headers)
    if res.status_code != 200:
        print("Error getting accounts:", res.text)
        return

    accounts = res.json().get('accounts', [])
    if not accounts:
        print("No accounts found.")
        return

    account_name = accounts[0]['name']
    print(f"Using account: {account_name}")

    conn = get_db_connection()
    with conn.cursor() as cur:
        cur.execute("SELECT outlet_name, popcorn_code, google_business_id FROM master_outlet WHERE google_business_id IS NOT NULL;")
        rows = cur.fetchall()
        for row in rows:
            name, code, loc_id = row
            location_name = f"locations/{loc_id.strip()}"
            url = f"https://mybusiness.googleapis.com/v4/{account_name}/{location_name}/reviews"
            response = requests.get(url, headers=headers)
            print(f"Outlet: {name} (Code: {code}, ID: {loc_id.strip()}) -> Status: {response.status_code}")
            if response.status_code == 200:
                data = response.json()
                reviews = data.get('reviews', [])
                print(f"  Found {len(reviews)} reviews.")
            else:
                print(f"  Error: {response.text[:200]}")
    conn.close()

if __name__ == '__main__':
    test_fetch_reviews()
60
test_api.py
Normal file
@ -0,0 +1,60 @@
import os
import requests
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request

def test_fetch_reviews():
    creds = Credentials.from_authorized_user_file('token.json')

    # Refresh token if needed
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())

    headers = {
        'Authorization': f'Bearer {creds.token}'
    }

    # We need an account ID. Let's fetch it first.
    account_url = "https://mybusinessaccountmanagement.googleapis.com/v1/accounts"
    res = requests.get(account_url, headers=headers)

    if res.status_code != 200:
        print("Error getting accounts:", res.text)
        return

    accounts = res.json().get('accounts', [])
    if not accounts:
        print("No accounts found.")
        return

    account_name = accounts[0]['name']  # format: accounts/1234
    print(f"Using account: {account_name}")

    # Test with one of the user's location IDs
    # e.g. location_id = "6212010560520025465"
    location_id = "6212010560520025465"
    location_name = f"locations/{location_id}"

    # The reviews endpoint is mybusiness.googleapis.com/v4/accounts/{accountId}/locations/{locationId}/reviews
    url = f"https://mybusiness.googleapis.com/v4/{account_name}/{location_name}/reviews"

    print(f"Fetching reviews from: {url}")
    response = requests.get(url, headers=headers)

    print(f"Status: {response.status_code}")
    if response.status_code != 200:
        print(f"Error: {response.text}")
        return  # bail out instead of parsing a non-JSON error body

    data = response.json()

    reviews = data.get('reviews', [])
    print(f"Found {len(reviews)} reviews.")
    if reviews:
        print("Sample review:")
        review = reviews[0]
        print(f"  ID: {review.get('reviewId')}")
        print(f"  Rating: {review.get('starRating')}")
        print(f"  Comment: {review.get('comment', '')[:100]}...")
        print(f"  Time: {review.get('createTime')}")

if __name__ == '__main__':
    test_fetch_reviews()
24
test_db.py
Normal file
@ -0,0 +1,24 @@
import os
import psycopg2
from dotenv import load_dotenv

load_dotenv()

def test_conn():
    print("Testing connection to:", os.getenv("DB_HOST"))
    try:
        conn = psycopg2.connect(
            host=os.getenv("DB_HOST"),
            port=os.getenv("DB_PORT"),
            database=os.getenv("DB_NAME"),
            user=os.getenv("DB_USER"),
            password=os.getenv("DB_PASSWORD"),
            connect_timeout=5
        )
        print("Connection successful!")
        conn.close()
    except Exception as e:
        print(f"Connection failed: {e}")

if __name__ == "__main__":
    test_conn()
1
token.json
Normal file
@ -0,0 +1 @@
{"token": "ya29.a0ATkoCc7vMAnUzeNJOS58XOXup4ELIWeSD_sjoQ1Px40jW2-fcfZW8ed4KdXz9eNYRfvBbxh7cIxPZ7SmsS8eFgnAvzyk06L76HqIrcgnz8Ukpa5YGlEyVozd5MXO6TMKpn6wcMkxI9QH6rqCF0rwlx1cII1lolXf6qNTT_GjqfrWVcSYJD4YLDiyO3F06PFlNdQjJKxeaCgYKAQ4SARUSFQHGX2MiZbK-cs6GQdJB0mNRyyNZXA0207", "refresh_token": "1//0g25i7BBB1dGPCgYIARAAGBASNwF-L9IrJM0yRF25DdKtrNrfA41BGH2xipGw_WwMSakD2zgJQH_LoxVqanuFgzq1FxZlmqvR9gQ", "token_uri": "https://oauth2.googleapis.com/token", "client_id": "804823156361-3mk31f6a14r6np9usmm2mo5qnjl9lk00.apps.googleusercontent.com", "client_secret": "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", "scopes": ["https://www.googleapis.com/auth/business.manage"], "universe_domain": "googleapis.com", "account": "", "expiry": "2026-02-25T10:38:07.053555Z"}
247
venv/bin/Activate.ps1
Normal file
@ -0,0 +1,247 @@
<#
.Synopsis
Activate a Python virtual environment for the current PowerShell session.

.Description
Pushes the python executable for a virtual environment to the front of the
$Env:PATH environment variable and sets the prompt to signify that you are
in a Python virtual environment. Makes use of the command line switches as
well as the `pyvenv.cfg` file values present in the virtual environment.

.Parameter VenvDir
Path to the directory that contains the virtual environment to activate. The
default value for this is the parent of the directory that the Activate.ps1
script is located within.

.Parameter Prompt
The prompt prefix to display when this virtual environment is activated. By
default, this prompt is the name of the virtual environment folder (VenvDir)
surrounded by parentheses and followed by a single space (ie. '(.venv) ').

.Example
Activate.ps1
Activates the Python virtual environment that contains the Activate.ps1 script.

.Example
Activate.ps1 -Verbose
Activates the Python virtual environment that contains the Activate.ps1 script,
and shows extra information about the activation as it executes.

.Example
Activate.ps1 -VenvDir C:\Users\MyUser\Common\.venv
Activates the Python virtual environment located in the specified location.

.Example
Activate.ps1 -Prompt "MyPython"
Activates the Python virtual environment that contains the Activate.ps1 script,
and prefixes the current prompt with the specified string (surrounded in
parentheses) while the virtual environment is active.

.Notes
On Windows, it may be required to enable this Activate.ps1 script by setting the
execution policy for the user. You can do this by issuing the following PowerShell
command:

PS C:\> Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

For more information on Execution Policies:
https://go.microsoft.com/fwlink/?LinkID=135170

#>
Param(
    [Parameter(Mandatory = $false)]
    [String]
    $VenvDir,
    [Parameter(Mandatory = $false)]
    [String]
    $Prompt
)

<# Function declarations --------------------------------------------------- #>

<#
.Synopsis
Remove all shell session elements added by the Activate script, including the
addition of the virtual environment's Python executable from the beginning of
the PATH variable.

.Parameter NonDestructive
If present, do not remove this function from the global namespace for the
session.

#>
function global:deactivate ([switch]$NonDestructive) {
    # Revert to original values

    # The prior prompt:
    if (Test-Path -Path Function:_OLD_VIRTUAL_PROMPT) {
        Copy-Item -Path Function:_OLD_VIRTUAL_PROMPT -Destination Function:prompt
        Remove-Item -Path Function:_OLD_VIRTUAL_PROMPT
    }

    # The prior PYTHONHOME:
    if (Test-Path -Path Env:_OLD_VIRTUAL_PYTHONHOME) {
        Copy-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME -Destination Env:PYTHONHOME
        Remove-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME
    }

    # The prior PATH:
    if (Test-Path -Path Env:_OLD_VIRTUAL_PATH) {
        Copy-Item -Path Env:_OLD_VIRTUAL_PATH -Destination Env:PATH
        Remove-Item -Path Env:_OLD_VIRTUAL_PATH
    }

    # Just remove the VIRTUAL_ENV altogether:
    if (Test-Path -Path Env:VIRTUAL_ENV) {
        Remove-Item -Path env:VIRTUAL_ENV
    }

    # Just remove VIRTUAL_ENV_PROMPT altogether.
    if (Test-Path -Path Env:VIRTUAL_ENV_PROMPT) {
        Remove-Item -Path env:VIRTUAL_ENV_PROMPT
    }

    # Just remove the _PYTHON_VENV_PROMPT_PREFIX altogether:
    if (Get-Variable -Name "_PYTHON_VENV_PROMPT_PREFIX" -ErrorAction SilentlyContinue) {
        Remove-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Scope Global -Force
    }

    # Leave deactivate function in the global namespace if requested:
    if (-not $NonDestructive) {
        Remove-Item -Path function:deactivate
    }
}

<#
.Description
Get-PyVenvConfig parses the values from the pyvenv.cfg file located in the
given folder, and returns them in a map.

For each line in the pyvenv.cfg file, if that line can be parsed into exactly
two strings separated by `=` (with any amount of whitespace surrounding the =)
then it is considered a `key = value` line. The left hand string is the key,
the right hand is the value.

If the value starts with a `'` or a `"` then the first and last character is
stripped from the value before being captured.

.Parameter ConfigDir
Path to the directory that contains the `pyvenv.cfg` file.
#>
function Get-PyVenvConfig(
    [String]
    $ConfigDir
) {
    Write-Verbose "Given ConfigDir=$ConfigDir, obtain values in pyvenv.cfg"

    # Ensure the file exists, and issue a warning if it doesn't (but still allow the function to continue).
    $pyvenvConfigPath = Join-Path -Resolve -Path $ConfigDir -ChildPath 'pyvenv.cfg' -ErrorAction Continue

    # An empty map will be returned if no config file is found.
    $pyvenvConfig = @{ }

    if ($pyvenvConfigPath) {

        Write-Verbose "File exists, parse `key = value` lines"
        $pyvenvConfigContent = Get-Content -Path $pyvenvConfigPath

        $pyvenvConfigContent | ForEach-Object {
            $keyval = $PSItem -split "\s*=\s*", 2
            if ($keyval[0] -and $keyval[1]) {
                $val = $keyval[1]

                # Remove extraneous quotations around a string value.
                if ("'""".Contains($val.Substring(0, 1))) {
                    $val = $val.Substring(1, $val.Length - 2)
                }

                $pyvenvConfig[$keyval[0]] = $val
                Write-Verbose "Adding Key: '$($keyval[0])'='$val'"
            }
        }
    }
    return $pyvenvConfig
}


<# Begin Activate script --------------------------------------------------- #>

# Determine the containing directory of this script
$VenvExecPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$VenvExecDir = Get-Item -Path $VenvExecPath

Write-Verbose "Activation script is located in path: '$VenvExecPath'"
Write-Verbose "VenvExecDir Fullname: '$($VenvExecDir.FullName)"
Write-Verbose "VenvExecDir Name: '$($VenvExecDir.Name)"

# Set values required in priority: CmdLine, ConfigFile, Default
# First, get the location of the virtual environment, it might not be
# VenvExecDir if specified on the command line.
if ($VenvDir) {
    Write-Verbose "VenvDir given as parameter, using '$VenvDir' to determine values"
}
else {
    Write-Verbose "VenvDir not given as a parameter, using parent directory name as VenvDir."
    $VenvDir = $VenvExecDir.Parent.FullName.TrimEnd("\\/")
    Write-Verbose "VenvDir=$VenvDir"
}

# Next, read the `pyvenv.cfg` file to determine any required value such
# as `prompt`.
$pyvenvCfg = Get-PyVenvConfig -ConfigDir $VenvDir

# Next, set the prompt from the command line, or the config file, or
# just use the name of the virtual environment folder.
if ($Prompt) {
    Write-Verbose "Prompt specified as argument, using '$Prompt'"
}
else {
    Write-Verbose "Prompt not specified as argument to script, checking pyvenv.cfg value"
    if ($pyvenvCfg -and $pyvenvCfg['prompt']) {
        Write-Verbose "  Setting based on value in pyvenv.cfg='$($pyvenvCfg['prompt'])'"
        $Prompt = $pyvenvCfg['prompt'];
    }
    else {
        Write-Verbose "  Setting prompt based on parent's directory's name. (Is the directory name passed to venv module when creating the virtual environment)"
        Write-Verbose "  Got leaf-name of $VenvDir='$(Split-Path -Path $venvDir -Leaf)'"
        $Prompt = Split-Path -Path $venvDir -Leaf
    }
}

Write-Verbose "Prompt = '$Prompt'"
Write-Verbose "VenvDir='$VenvDir'"

# Deactivate any currently active virtual environment, but leave the
# deactivate function in place.
deactivate -nondestructive

# Now set the environment variable VIRTUAL_ENV, used by many tools to determine
# that there is an activated venv.
$env:VIRTUAL_ENV = $VenvDir

if (-not $Env:VIRTUAL_ENV_DISABLE_PROMPT) {

    Write-Verbose "Setting prompt to '$Prompt'"

    # Set the prompt to include the env name
    # Make sure _OLD_VIRTUAL_PROMPT is global
    function global:_OLD_VIRTUAL_PROMPT { "" }
    Copy-Item -Path function:prompt -Destination function:_OLD_VIRTUAL_PROMPT
    New-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Description "Python virtual environment prompt prefix" -Scope Global -Option ReadOnly -Visibility Public -Value $Prompt

    function global:prompt {
        Write-Host -NoNewline -ForegroundColor Green "($_PYTHON_VENV_PROMPT_PREFIX) "
        _OLD_VIRTUAL_PROMPT
    }
    $env:VIRTUAL_ENV_PROMPT = $Prompt
}

# Clear PYTHONHOME
if (Test-Path -Path Env:PYTHONHOME) {
    Copy-Item -Path Env:PYTHONHOME -Destination Env:_OLD_VIRTUAL_PYTHONHOME
    Remove-Item -Path Env:PYTHONHOME
}

# Add the venv to the PATH
Copy-Item -Path Env:PATH -Destination Env:_OLD_VIRTUAL_PATH
$Env:PATH = "$VenvExecDir$([System.IO.Path]::PathSeparator)$Env:PATH"
70
venv/bin/activate
Normal file
@ -0,0 +1,70 @@
# This file must be used with "source bin/activate" *from bash*
# You cannot run it directly

deactivate () {
    # reset old environment variables
    if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
        PATH="${_OLD_VIRTUAL_PATH:-}"
        export PATH
        unset _OLD_VIRTUAL_PATH
    fi
    if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
        PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
        export PYTHONHOME
        unset _OLD_VIRTUAL_PYTHONHOME
    fi

    # Call hash to forget past commands. Without forgetting
    # past commands the $PATH changes we made may not be respected
    hash -r 2> /dev/null

    if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
        PS1="${_OLD_VIRTUAL_PS1:-}"
        export PS1
        unset _OLD_VIRTUAL_PS1
    fi

    unset VIRTUAL_ENV
    unset VIRTUAL_ENV_PROMPT
    if [ ! "${1:-}" = "nondestructive" ] ; then
        # Self destruct!
        unset -f deactivate
    fi
}

# unset irrelevant variables
deactivate nondestructive

# on Windows, a path can contain colons and backslashes and has to be converted:
if [ "${OSTYPE:-}" = "cygwin" ] || [ "${OSTYPE:-}" = "msys" ] ; then
    # transform D:\path\to\venv to /d/path/to/venv on MSYS
    # and to /cygdrive/d/path/to/venv on Cygwin
    export VIRTUAL_ENV=$(cygpath /home/suherdy/Pythoncode/google_map_review/venv)
else
    # use the path as-is
    export VIRTUAL_ENV=/home/suherdy/Pythoncode/google_map_review/venv
fi

_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/"bin":$PATH"
export PATH

# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "${PYTHONHOME:-}" ] ; then
    _OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
    unset PYTHONHOME
fi

if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
    _OLD_VIRTUAL_PS1="${PS1:-}"
    PS1='(venv) '"${PS1:-}"
    export PS1
    VIRTUAL_ENV_PROMPT='(venv) '
    export VIRTUAL_ENV_PROMPT
fi

# Call hash to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
hash -r 2> /dev/null
27
venv/bin/activate.csh
Normal file
@ -0,0 +1,27 @@
# This file must be used with "source bin/activate.csh" *from csh*.
# You cannot run it directly.

# Created by Davide Di Blasi <davidedb@gmail.com>.
# Ported to Python 3.3 venv by Andrew Svetlov <andrew.svetlov@gmail.com>

alias deactivate 'test $?_OLD_VIRTUAL_PATH != 0 && setenv PATH "$_OLD_VIRTUAL_PATH" && unset _OLD_VIRTUAL_PATH; rehash; test $?_OLD_VIRTUAL_PROMPT != 0 && set prompt="$_OLD_VIRTUAL_PROMPT" && unset _OLD_VIRTUAL_PROMPT; unsetenv VIRTUAL_ENV; unsetenv VIRTUAL_ENV_PROMPT; test "\!:*" != "nondestructive" && unalias deactivate'

# Unset irrelevant variables.
deactivate nondestructive

setenv VIRTUAL_ENV /home/suherdy/Pythoncode/google_map_review/venv

set _OLD_VIRTUAL_PATH="$PATH"
setenv PATH "$VIRTUAL_ENV/"bin":$PATH"


set _OLD_VIRTUAL_PROMPT="$prompt"

if (! "$?VIRTUAL_ENV_DISABLE_PROMPT") then
    set prompt = '(venv) '"$prompt"
    setenv VIRTUAL_ENV_PROMPT '(venv) '
endif

alias pydoc python -m pydoc

rehash
69
venv/bin/activate.fish
Normal file
@ -0,0 +1,69 @@
# This file must be used with "source <venv>/bin/activate.fish" *from fish*
# (https://fishshell.com/). You cannot run it directly.

function deactivate -d "Exit virtual environment and return to normal shell environment"
    # reset old environment variables
    if test -n "$_OLD_VIRTUAL_PATH"
        set -gx PATH $_OLD_VIRTUAL_PATH
        set -e _OLD_VIRTUAL_PATH
    end
    if test -n "$_OLD_VIRTUAL_PYTHONHOME"
        set -gx PYTHONHOME $_OLD_VIRTUAL_PYTHONHOME
        set -e _OLD_VIRTUAL_PYTHONHOME
    end

    if test -n "$_OLD_FISH_PROMPT_OVERRIDE"
        set -e _OLD_FISH_PROMPT_OVERRIDE
        # prevents error when using nested fish instances (Issue #93858)
        if functions -q _old_fish_prompt
            functions -e fish_prompt
            functions -c _old_fish_prompt fish_prompt
            functions -e _old_fish_prompt
        end
    end

    set -e VIRTUAL_ENV
    set -e VIRTUAL_ENV_PROMPT
    if test "$argv[1]" != "nondestructive"
        # Self-destruct!
        functions -e deactivate
    end
end

# Unset irrelevant variables.
deactivate nondestructive

set -gx VIRTUAL_ENV /home/suherdy/Pythoncode/google_map_review/venv

set -gx _OLD_VIRTUAL_PATH $PATH
set -gx PATH "$VIRTUAL_ENV/"bin $PATH

# Unset PYTHONHOME if set.
if set -q PYTHONHOME
    set -gx _OLD_VIRTUAL_PYTHONHOME $PYTHONHOME
    set -e PYTHONHOME
end

if test -z "$VIRTUAL_ENV_DISABLE_PROMPT"
    # fish uses a function instead of an env var to generate the prompt.

    # Save the current fish_prompt function as the function _old_fish_prompt.
    functions -c fish_prompt _old_fish_prompt

    # With the original prompt function renamed, we can override with our own.
    function fish_prompt
        # Save the return status of the last command.
        set -l old_status $status

        # Output the venv prompt; color taken from the blue of the Python logo.
        printf "%s%s%s" (set_color 4B8BBE) '(venv) ' (set_color normal)

        # Restore the return status of the previous command.
        echo "exit $old_status" | .
        # Output the original/"old" prompt.
        _old_fish_prompt
    end

    set -gx _OLD_FISH_PROMPT_OVERRIDE "$VIRTUAL_ENV"
    set -gx VIRTUAL_ENV_PROMPT '(venv) '
end
8
venv/bin/dotenv
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from dotenv.__main__ import cli
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(cli())
8
venv/bin/google-oauthlib-tool
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from google_auth_oauthlib.tool.__main__ import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
8
venv/bin/normalizer
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from charset_normalizer.cli import cli_detect
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(cli_detect())
8
venv/bin/pip
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
8
venv/bin/pip3
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
8
venv/bin/pip3.12
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
8
venv/bin/pyrsa-decrypt
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.cli import decrypt
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(decrypt())
8
venv/bin/pyrsa-encrypt
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.cli import encrypt
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(encrypt())
8
venv/bin/pyrsa-keygen
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.cli import keygen
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(keygen())
8
venv/bin/pyrsa-priv2pub
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.util import private_to_public
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(private_to_public())
8
venv/bin/pyrsa-sign
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.cli import sign
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(sign())
8
venv/bin/pyrsa-verify
Executable file
@ -0,0 +1,8 @@
#!/home/suherdy/Pythoncode/google_map_review/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from rsa.cli import verify
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(verify())
1
venv/bin/python
Symbolic link
@ -0,0 +1 @@
python3
1
venv/bin/python3
Symbolic link
@ -0,0 +1 @@
/usr/bin/python3
1
venv/bin/python3.12
Symbolic link
@ -0,0 +1 @@
python3
BIN
venv/lib/python3.12/site-packages/_cffi_backend.cpython-312-x86_64-linux-gnu.so
Executable file
Binary file not shown.
27
venv/lib/python3.12/site-packages/apiclient/__init__.py
Normal file
@ -0,0 +1,27 @@
"""Retain apiclient as an alias for googleapiclient."""

from googleapiclient import channel, discovery, errors, http, mimeparse, model

try:
    from googleapiclient import sample_tools
except ImportError:
    # Silently ignore, because the vast majority of consumers won't use it and
    # it has deep dependence on oauth2client, an optional dependency.
    sample_tools = None
from googleapiclient import schema

_SUBMODULES = {
    "channel": channel,
    "discovery": discovery,
    "errors": errors,
    "http": http,
    "mimeparse": mimeparse,
    "model": model,
    "sample_tools": sample_tools,
    "schema": schema,
}

import sys

for module_name, module in _SUBMODULES.items():
    sys.modules["apiclient.%s" % module_name] = module
Binary file not shown.
@ -0,0 +1 @@
pip
@ -0,0 +1,78 @@
Metadata-Version: 2.4
Name: certifi
Version: 2026.1.4
Summary: Python package for providing Mozilla's CA Bundle.
Home-page: https://github.com/certifi/python-certifi
Author: Kenneth Reitz
Author-email: me@kennethreitz.com
License: MPL-2.0
Project-URL: Source, https://github.com/certifi/python-certifi
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)
Classifier: Natural Language :: English
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Requires-Python: >=3.7
License-File: LICENSE
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: home-page
Dynamic: license
Dynamic: license-file
Dynamic: project-url
Dynamic: requires-python
Dynamic: summary

Certifi: Python SSL Certificates
================================

Certifi provides Mozilla's carefully curated collection of Root Certificates for
validating the trustworthiness of SSL certificates while verifying the identity
of TLS hosts. It has been extracted from the `Requests`_ project.

Installation
------------

``certifi`` is available on PyPI. Simply install it with ``pip``::

    $ pip install certifi

Usage
-----

To reference the installed certificate authority (CA) bundle, you can use the
built-in function::

    >>> import certifi

    >>> certifi.where()
    '/usr/local/lib/python3.7/site-packages/certifi/cacert.pem'

Or from the command line::

    $ python -m certifi
    /usr/local/lib/python3.7/site-packages/certifi/cacert.pem

Enjoy!

.. _`Requests`: https://requests.readthedocs.io/en/master/

Addition/Removal of Certificates
--------------------------------

Certifi does not support any addition/removal or other modification of the
CA trust store content. This project is intended to provide a reliable and
highly portable root of trust to python deployments. Look to upstream projects
for methods to use alternate trust.
@ -0,0 +1,14 @@
certifi-2026.1.4.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
certifi-2026.1.4.dist-info/METADATA,sha256=FSfJEfKuMo6bJlofUrtRpn4PFTYtbYyXpHN_A3ZFpIY,2473
certifi-2026.1.4.dist-info/RECORD,,
certifi-2026.1.4.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
certifi-2026.1.4.dist-info/licenses/LICENSE,sha256=6TcW2mucDVpKHfYP5pWzcPBpVgPSH2-D8FPkLPwQyvc,989
certifi-2026.1.4.dist-info/top_level.txt,sha256=KMu4vUCfsjLrkPbSNdgdekS-pVJzBAJFO__nI8NF6-U,8
certifi/__init__.py,sha256=969deMMS7Uchipr0oO4dbRBUvRi0uNYCn07VmG1aTrg,94
certifi/__main__.py,sha256=xBBoj905TUWBLRGANOcf7oi6e-3dMP4cEoG9OyMs11g,243
certifi/__pycache__/__init__.cpython-312.pyc,,
certifi/__pycache__/__main__.cpython-312.pyc,,
certifi/__pycache__/core.cpython-312.pyc,,
certifi/cacert.pem,sha256=Tzl1_zCrvzVEO0hgZK6Ly0Hf9wf_31dsdtKS-0WKoKk,270954
certifi/core.py,sha256=XFXycndG5pf37ayeF8N32HUuDafsyhkVMbO4BAPWHa0,3394
certifi/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: setuptools (80.9.0)
Root-Is-Purelib: true
Tag: py3-none-any

@ -0,0 +1,20 @@
This package contains a modified version of ca-bundle.crt:

ca-bundle.crt -- Bundle of CA Root Certificates

This is a bundle of X.509 certificates of public Certificate Authorities
(CA). These were automatically extracted from Mozilla's root certificates
file (certdata.txt). This file can be found in the mozilla source tree:
https://hg.mozilla.org/mozilla-central/file/tip/security/nss/lib/ckfw/builtins/certdata.txt
It contains the certificates in PEM format and therefore
can be directly used with curl / libcurl / php_curl, or with
an Apache+mod_ssl webserver for SSL client authentication.
Just configure this file as the SSLCACertificateFile.#

***** BEGIN LICENSE BLOCK *****
This Source Code Form is subject to the terms of the Mozilla Public License,
v. 2.0. If a copy of the MPL was not distributed with this file, You can obtain
one at http://mozilla.org/MPL/2.0/.

***** END LICENSE BLOCK *****
@(#) $RCSfile: certdata.txt,v $ $Revision: 1.80 $ $Date: 2011/11/03 15:11:58 $
@ -0,0 +1 @@
certifi
4
venv/lib/python3.12/site-packages/certifi/__init__.py
Normal file
@ -0,0 +1,4 @@
from .core import contents, where

__all__ = ["contents", "where"]
__version__ = "2026.01.04"
12
venv/lib/python3.12/site-packages/certifi/__main__.py
Normal file
@ -0,0 +1,12 @@
import argparse

from certifi import contents, where

parser = argparse.ArgumentParser()
parser.add_argument("-c", "--contents", action="store_true")
args = parser.parse_args()

if args.contents:
    print(contents())
else:
    print(where())
Binary file not shown.
Binary file not shown.
Binary file not shown.
4468
venv/lib/python3.12/site-packages/certifi/cacert.pem
Normal file
File diff suppressed because it is too large
83
venv/lib/python3.12/site-packages/certifi/core.py
Normal file
@ -0,0 +1,83 @@
"""
certifi.py
~~~~~~~~~~

This module returns the installation location of cacert.pem or its contents.
"""
import sys
import atexit

def exit_cacert_ctx() -> None:
    _CACERT_CTX.__exit__(None, None, None)  # type: ignore[union-attr]


if sys.version_info >= (3, 11):

    from importlib.resources import as_file, files

    _CACERT_CTX = None
    _CACERT_PATH = None

    def where() -> str:
        # This is slightly terrible, but we want to delay extracting the file
        # in cases where we're inside of a zipimport situation until someone
        # actually calls where(), but we don't want to re-extract the file
        # on every call of where(), so we'll do it once then store it in a
        # global variable.
        global _CACERT_CTX
        global _CACERT_PATH
        if _CACERT_PATH is None:
            # This is slightly janky, the importlib.resources API wants you to
            # manage the cleanup of this file, so it doesn't actually return a
            # path, it returns a context manager that will give you the path
            # when you enter it and will do any cleanup when you leave it. In
            # the common case of not needing a temporary file, it will just
            # return the file system location and the __exit__() is a no-op.
            #
            # We also have to hold onto the actual context manager, because
            # it will do the cleanup whenever it gets garbage collected, so
            # we will also store that at the global level as well.
            _CACERT_CTX = as_file(files("certifi").joinpath("cacert.pem"))
            _CACERT_PATH = str(_CACERT_CTX.__enter__())
            atexit.register(exit_cacert_ctx)

        return _CACERT_PATH

    def contents() -> str:
        return files("certifi").joinpath("cacert.pem").read_text(encoding="ascii")

else:

    from importlib.resources import path as get_path, read_text

    _CACERT_CTX = None
    _CACERT_PATH = None

    def where() -> str:
        # This is slightly terrible, but we want to delay extracting the
        # file in cases where we're inside of a zipimport situation until
        # someone actually calls where(), but we don't want to re-extract
        # the file on every call of where(), so we'll do it once then store
        # it in a global variable.
        global _CACERT_CTX
        global _CACERT_PATH
        if _CACERT_PATH is None:
            # This is slightly janky, the importlib.resources API wants you
            # to manage the cleanup of this file, so it doesn't actually
            # return a path, it returns a context manager that will give
            # you the path when you enter it and will do any cleanup when
            # you leave it. In the common case of not needing a temporary
            # file, it will just return the file system location and the
            # __exit__() is a no-op.
            #
            # We also have to hold onto the actual context manager, because
            # it will do the cleanup whenever it gets garbage collected, so
            # we will also store that at the global level as well.
            _CACERT_CTX = get_path("certifi", "cacert.pem")
            _CACERT_PATH = str(_CACERT_CTX.__enter__())
            atexit.register(exit_cacert_ctx)

        return _CACERT_PATH

    def contents() -> str:
        return read_text("certifi", "cacert.pem", encoding="ascii")
0
venv/lib/python3.12/site-packages/certifi/py.typed
Normal file
@ -0,0 +1 @@
pip
@ -0,0 +1,68 @@
Metadata-Version: 2.4
Name: cffi
Version: 2.0.0
Summary: Foreign Function Interface for Python calling C code.
Author: Armin Rigo, Maciej Fijalkowski
Maintainer: Matt Davis, Matt Clay, Matti Picus
License-Expression: MIT
Project-URL: Documentation, https://cffi.readthedocs.io/
Project-URL: Changelog, https://cffi.readthedocs.io/en/latest/whatsnew.html
Project-URL: Downloads, https://github.com/python-cffi/cffi/releases
Project-URL: Contact, https://groups.google.com/forum/#!forum/python-cffi
Project-URL: Source Code, https://github.com/python-cffi/cffi
Project-URL: Issue Tracker, https://github.com/python-cffi/cffi/issues
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Programming Language :: Python :: Free Threading :: 2 - Beta
Classifier: Programming Language :: Python :: Implementation :: CPython
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
License-File: AUTHORS
Requires-Dist: pycparser; implementation_name != "PyPy"
Dynamic: license-file

[](https://github.com/python-cffi/cffi/actions/workflows/ci.yaml?query=branch%3Amain++)
[](https://pypi.org/project/cffi)
[][Documentation]

CFFI
====

Foreign Function Interface for Python calling C code.

Please see the [Documentation] or uncompiled in the `doc/` subdirectory.

Download
--------

[Download page](https://github.com/python-cffi/cffi/releases)

Source Code
-----------

Source code is publicly available on
[GitHub](https://github.com/python-cffi/cffi).

Contact
-------

[Mailing list](https://groups.google.com/forum/#!forum/python-cffi)

Testing/development tips
------------------------

After `git clone` or `wget && tar`, we will get a directory called `cffi` or `cffi-x.x.x`. We call it `repo-directory`. To run tests under CPython, run the following in the `repo-directory`:

    pip install pytest
    pip install -e .    # editable install of CFFI for local development
    pytest src/c/ testing/

[Documentation]: http://cffi.readthedocs.org/
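The test instructions above assume an API-mode build with a C compiler. For a quick smoke test of an installed cffi, the ABI ("in-line") mode needs no compilation step. A minimal sketch, assuming `cffi` is importable and a POSIX libc is available (`ffi.dlopen(None)` exposes the symbols already linked into the process, which is the usual behavior on Linux and macOS):

```python
from cffi import FFI

ffi = FFI()
# Declare the C function we want to call, in plain C syntax.
ffi.cdef("int atoi(const char *s);")

# dlopen(None) loads the symbols already present in the running process
# (libc on POSIX), so no compiler is involved.
libc = ffi.dlopen(None)

print(libc.atoi(b"42"))
```

This is the quickest way to confirm that both the pure-Python layer and the `_cffi_backend` extension module were installed correctly.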
@ -0,0 +1,49 @@
_cffi_backend.cpython-312-x86_64-linux-gnu.so,sha256=AGLtw5fn9u4Cmwk3BbGlsXG7VZEvQekABMyEGuRZmcE,348808
cffi-2.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
cffi-2.0.0.dist-info/METADATA,sha256=uYzn40F68Im8EtXHNBLZs7FoPM-OxzyYbDWsjJvhujk,2559
cffi-2.0.0.dist-info/RECORD,,
cffi-2.0.0.dist-info/WHEEL,sha256=aSgG0F4rGPZtV0iTEIfy6dtHq6g67Lze3uLfk0vWn88,151
cffi-2.0.0.dist-info/entry_points.txt,sha256=y6jTxnyeuLnL-XJcDv8uML3n6wyYiGRg8MTp_QGJ9Ho,75
cffi-2.0.0.dist-info/licenses/AUTHORS,sha256=KmemC7-zN1nWfWRf8TG45ta8TK_CMtdR_Kw-2k0xTMg,208
cffi-2.0.0.dist-info/licenses/LICENSE,sha256=W6JN3FcGf5JJrdZEw6_EGl1tw34jQz73Wdld83Cwr2M,1123
cffi-2.0.0.dist-info/top_level.txt,sha256=rE7WR3rZfNKxWI9-jn6hsHCAl7MDkB-FmuQbxWjFehQ,19
cffi/__init__.py,sha256=-ksBQ7MfDzVvbBlV_ftYBWAmEqfA86ljIzMxzaZeAlI,511
cffi/__pycache__/__init__.cpython-312.pyc,,
cffi/__pycache__/_imp_emulation.cpython-312.pyc,,
cffi/__pycache__/_shimmed_dist_utils.cpython-312.pyc,,
cffi/__pycache__/api.cpython-312.pyc,,
cffi/__pycache__/backend_ctypes.cpython-312.pyc,,
cffi/__pycache__/cffi_opcode.cpython-312.pyc,,
cffi/__pycache__/commontypes.cpython-312.pyc,,
cffi/__pycache__/cparser.cpython-312.pyc,,
cffi/__pycache__/error.cpython-312.pyc,,
cffi/__pycache__/ffiplatform.cpython-312.pyc,,
cffi/__pycache__/lock.cpython-312.pyc,,
cffi/__pycache__/model.cpython-312.pyc,,
cffi/__pycache__/pkgconfig.cpython-312.pyc,,
cffi/__pycache__/recompiler.cpython-312.pyc,,
cffi/__pycache__/setuptools_ext.cpython-312.pyc,,
cffi/__pycache__/vengine_cpy.cpython-312.pyc,,
cffi/__pycache__/vengine_gen.cpython-312.pyc,,
cffi/__pycache__/verifier.cpython-312.pyc,,
cffi/_cffi_errors.h,sha256=zQXt7uR_m8gUW-fI2hJg0KoSkJFwXv8RGUkEDZ177dQ,3908
cffi/_cffi_include.h,sha256=Exhmgm9qzHWzWivjfTe0D7Xp4rPUkVxdNuwGhMTMzbw,15055
cffi/_embedding.h,sha256=Ai33FHblE7XSpHOCp8kPcWwN5_9BV14OvN0JVa6ITpw,18786
cffi/_imp_emulation.py,sha256=RxREG8zAbI2RPGBww90u_5fi8sWdahpdipOoPzkp7C0,2960
cffi/_shimmed_dist_utils.py,sha256=Bjj2wm8yZbvFvWEx5AEfmqaqZyZFhYfoyLLQHkXZuao,2230
cffi/api.py,sha256=alBv6hZQkjpmZplBphdaRn2lPO9-CORs_M7ixabvZWI,42169
cffi/backend_ctypes.py,sha256=h5ZIzLc6BFVXnGyc9xPqZWUS7qGy7yFSDqXe68Sa8z4,42454
cffi/cffi_opcode.py,sha256=JDV5l0R0_OadBX_uE7xPPTYtMdmpp8I9UYd6av7aiDU,5731
cffi/commontypes.py,sha256=7N6zPtCFlvxXMWhHV08psUjdYIK2XgsN3yo5dgua_v4,2805
cffi/cparser.py,sha256=QUTfmlL-aO-MYR8bFGlvAUHc36OQr7XYLe0WLkGFjRo,44790
cffi/error.py,sha256=v6xTiS4U0kvDcy4h_BDRo5v39ZQuj-IMRYLv5ETddZs,877
cffi/ffiplatform.py,sha256=avxFjdikYGJoEtmJO7ewVmwG_VEVl6EZ_WaNhZYCqv4,3584
cffi/lock.py,sha256=l9TTdwMIMpi6jDkJGnQgE9cvTIR7CAntIJr8EGHt3pY,747
cffi/model.py,sha256=W30UFQZE73jL5Mx5N81YT77us2W2iJjTm0XYfnwz1cg,21797
cffi/parse_c_type.h,sha256=OdwQfwM9ktq6vlCB43exFQmxDBtj2MBNdK8LYl15tjw,5976
cffi/pkgconfig.py,sha256=LP1w7vmWvmKwyqLaU1Z243FOWGNQMrgMUZrvgFuOlco,4374
cffi/recompiler.py,sha256=78J6lMEEOygXNmjN9-fOFFO3j7eW-iFxSrxfvQb54bY,65509
cffi/setuptools_ext.py,sha256=0rCwBJ1W7FHWtiMKfNXsSST88V8UXrui5oeXFlDNLG8,9411
cffi/vengine_cpy.py,sha256=oyQKD23kpE0aChUKA8Jg0e723foPiYzLYEdb-J0MiNs,43881
cffi/vengine_gen.py,sha256=DUlEIrDiVin1Pnhn1sfoamnS5NLqfJcOdhRoeSNeJRg,26939
cffi/verifier.py,sha256=oX8jpaohg2Qm3aHcznidAdvrVm5N4sQYG0a3Eo5mIl4,11182
@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: setuptools (80.9.0)
Root-Is-Purelib: false
Tag: cp312-cp312-manylinux_2_17_x86_64
Tag: cp312-cp312-manylinux2014_x86_64

@ -0,0 +1,2 @@
[distutils.setup_keywords]
cffi_modules = cffi.setuptools_ext:cffi_modules
@ -0,0 +1,8 @@
This package has been mostly done by Armin Rigo with help from
Maciej Fijałkowski. The idea is heavily based (although not directly
copied) from LuaJIT ffi by Mike Pall.


Other contributors:

Google Inc.
@ -0,0 +1,23 @@

Except when otherwise stated (look for LICENSE files in directories or
information at the beginning of each file) all software and
documentation is licensed as follows:

    MIT No Attribution

    Permission is hereby granted, free of charge, to any person
    obtaining a copy of this software and associated documentation
    files (the "Software"), to deal in the Software without
    restriction, including without limitation the rights to use,
    copy, modify, merge, publish, distribute, sublicense, and/or
    sell copies of the Software, and to permit persons to whom the
    Software is furnished to do so.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
    OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
    THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
    DEALINGS IN THE SOFTWARE.

@ -0,0 +1,2 @@
_cffi_backend
cffi
14
venv/lib/python3.12/site-packages/cffi/__init__.py
Normal file
@ -0,0 +1,14 @@
__all__ = ['FFI', 'VerificationError', 'VerificationMissing', 'CDefError',
           'FFIError']

from .api import FFI
from .error import CDefError, FFIError, VerificationError, VerificationMissing
from .error import PkgConfigError

__version__ = "2.0.0"
__version_info__ = (2, 0, 0)

# The verifier module file names are based on the CRC32 of a string that
# contains the following version number. It may be older than __version__
# if nothing is clearly incompatible.
__version_verifier_modules__ = "0.8.6"
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
149
venv/lib/python3.12/site-packages/cffi/_cffi_errors.h
Normal file
@ -0,0 +1,149 @@
#ifndef CFFI_MESSAGEBOX
# ifdef _MSC_VER
#  define CFFI_MESSAGEBOX 1
# else
#  define CFFI_MESSAGEBOX 0
# endif
#endif


#if CFFI_MESSAGEBOX
/* Windows only: logic to take the Python-CFFI embedding logic
   initialization errors and display them in a background thread
   with MessageBox.  The idea is that if the whole program closes
   as a result of this problem, then likely it is already a console
   program and you can read the stderr output in the console too.
   If it is not a console program, then it will likely show its own
   dialog to complain, or generally not abruptly close, and for this
   case the background thread should stay alive.
*/
static void *volatile _cffi_bootstrap_text;

static PyObject *_cffi_start_error_capture(void)
{
    PyObject *result = NULL;
    PyObject *x, *m, *bi;

    if (InterlockedCompareExchangePointer(&_cffi_bootstrap_text,
                                          (void *)1, NULL) != NULL)
        return (PyObject *)1;

    m = PyImport_AddModule("_cffi_error_capture");
    if (m == NULL)
        goto error;

    result = PyModule_GetDict(m);
    if (result == NULL)
        goto error;

#if PY_MAJOR_VERSION >= 3
    bi = PyImport_ImportModule("builtins");
#else
    bi = PyImport_ImportModule("__builtin__");
#endif
    if (bi == NULL)
        goto error;
    PyDict_SetItemString(result, "__builtins__", bi);
    Py_DECREF(bi);

    x = PyRun_String(
        "import sys\n"
        "class FileLike:\n"
        "  def write(self, x):\n"
        "    try:\n"
        "      of.write(x)\n"
        "    except: pass\n"
        "    self.buf += x\n"
        "  def flush(self):\n"
        "    pass\n"
        "fl = FileLike()\n"
        "fl.buf = ''\n"
        "of = sys.stderr\n"
        "sys.stderr = fl\n"
        "def done():\n"
        "  sys.stderr = of\n"
        "  return fl.buf\n",   /* make sure the returned value stays alive */
        Py_file_input,
        result, result);
    Py_XDECREF(x);

 error:
    if (PyErr_Occurred())
    {
        PyErr_WriteUnraisable(Py_None);
        PyErr_Clear();
    }
    return result;
}

#pragma comment(lib, "user32.lib")

static DWORD WINAPI _cffi_bootstrap_dialog(LPVOID ignored)
{
    Sleep(666);    /* may be interrupted if the whole process is closing */
#if PY_MAJOR_VERSION >= 3
    MessageBoxW(NULL, (wchar_t *)_cffi_bootstrap_text,
                L"Python-CFFI error",
                MB_OK | MB_ICONERROR);
#else
    MessageBoxA(NULL, (char *)_cffi_bootstrap_text,
                "Python-CFFI error",
                MB_OK | MB_ICONERROR);
#endif
    _cffi_bootstrap_text = NULL;
    return 0;
}

static void _cffi_stop_error_capture(PyObject *ecap)
{
    PyObject *s;
    void *text;

    if (ecap == (PyObject *)1)
        return;

    if (ecap == NULL)
        goto error;

    s = PyRun_String("done()", Py_eval_input, ecap, ecap);
    if (s == NULL)
        goto error;

    /* Show a dialog box, but in a background thread, and
       never show multiple dialog boxes at once. */
#if PY_MAJOR_VERSION >= 3
    text = PyUnicode_AsWideCharString(s, NULL);
#else
    text = PyString_AsString(s);
#endif

    _cffi_bootstrap_text = text;

    if (text != NULL)
    {
        HANDLE h;
        h = CreateThread(NULL, 0, _cffi_bootstrap_dialog,
                         NULL, 0, NULL);
        if (h != NULL)
            CloseHandle(h);
    }
    /* decref the string, but it should stay alive as 'fl.buf'
       in the small module above.  It will really be freed only if
       we later get another similar error.  So it's a leak of at
       most one copy of the small module.  That's fine for this
       situation which is usually a "fatal error" anyway. */
    Py_DECREF(s);
    PyErr_Clear();
    return;

 error:
    _cffi_bootstrap_text = NULL;
    PyErr_Clear();
}

#else

static PyObject *_cffi_start_error_capture(void) { return NULL; }
static void _cffi_stop_error_capture(PyObject *ecap) { }

#endif
389
venv/lib/python3.12/site-packages/cffi/_cffi_include.h
Normal file
@ -0,0 +1,389 @@
#define _CFFI_

/* We try to define Py_LIMITED_API before including Python.h.

   Mess: we can only define it if Py_DEBUG, Py_TRACE_REFS and
   Py_REF_DEBUG are not defined.  This is a best-effort approximation:
   we can learn about Py_DEBUG from pyconfig.h, but it is unclear if
   the same works for the other two macros.  Py_DEBUG implies them,
   but not the other way around.

   The implementation is messy (issue #350): on Windows, with _MSC_VER,
   we have to define Py_LIMITED_API even before including pyconfig.h.
   In that case, we guess what pyconfig.h will do to the macros above,
   and check our guess after the #include.

   Note that on Windows, with CPython 3.x, you need >= 3.5 and virtualenv
   version >= 16.0.0.  With older versions of either, you don't get a
   copy of PYTHON3.DLL in the virtualenv.  We can't check the version of
   CPython *before* we even include pyconfig.h.  ffi.set_source() puts
   a ``#define _CFFI_NO_LIMITED_API'' at the start of this file if it is
   running on Windows < 3.5, as an attempt at fixing it, but that's
   arguably wrong because it may not be the target version of Python.
   Still better than nothing I guess.  As another workaround, you can
   remove the definition of Py_LIMITED_API here.

   See also 'py_limited_api' in cffi/setuptools_ext.py.
*/
#if !defined(_CFFI_USE_EMBEDDING) && !defined(Py_LIMITED_API)
#  ifdef _MSC_VER
#    if !defined(_DEBUG) && !defined(Py_DEBUG) && !defined(Py_TRACE_REFS) && !defined(Py_REF_DEBUG) && !defined(_CFFI_NO_LIMITED_API)
#      define Py_LIMITED_API
#    endif
#    include <pyconfig.h>
     /* sanity-check: Py_LIMITED_API will cause crashes if any of these
        are also defined.  Normally, the Python file PC/pyconfig.h does not
        cause any of these to be defined, with the exception that _DEBUG
        causes Py_DEBUG.  Double-check that. */
#    ifdef Py_LIMITED_API
#      if defined(Py_DEBUG)
#        error "pyconfig.h unexpectedly defines Py_DEBUG, but Py_LIMITED_API is set"
#      endif
#      if defined(Py_TRACE_REFS)
#        error "pyconfig.h unexpectedly defines Py_TRACE_REFS, but Py_LIMITED_API is set"
#      endif
#      if defined(Py_REF_DEBUG)
#        error "pyconfig.h unexpectedly defines Py_REF_DEBUG, but Py_LIMITED_API is set"
#      endif
#    endif
#  else
#    include <pyconfig.h>
#    if !defined(Py_DEBUG) && !defined(Py_TRACE_REFS) && !defined(Py_REF_DEBUG) && !defined(_CFFI_NO_LIMITED_API)
#      define Py_LIMITED_API
#    endif
#  endif
#endif

#include <Python.h>
#ifdef __cplusplus
extern "C" {
#endif
#include <stddef.h>
#include "parse_c_type.h"

/* this block of #ifs should be kept exactly identical between
   c/_cffi_backend.c, cffi/vengine_cpy.py, cffi/vengine_gen.py
   and cffi/_cffi_include.h */
#if defined(_MSC_VER)
# include <malloc.h>   /* for alloca() */
# if _MSC_VER < 1600   /* MSVC < 2010 */
   typedef __int8 int8_t;
   typedef __int16 int16_t;
   typedef __int32 int32_t;
   typedef __int64 int64_t;
   typedef unsigned __int8 uint8_t;
   typedef unsigned __int16 uint16_t;
   typedef unsigned __int32 uint32_t;
   typedef unsigned __int64 uint64_t;
   typedef __int8 int_least8_t;
   typedef __int16 int_least16_t;
   typedef __int32 int_least32_t;
   typedef __int64 int_least64_t;
   typedef unsigned __int8 uint_least8_t;
   typedef unsigned __int16 uint_least16_t;
   typedef unsigned __int32 uint_least32_t;
   typedef unsigned __int64 uint_least64_t;
   typedef __int8 int_fast8_t;
   typedef __int16 int_fast16_t;
   typedef __int32 int_fast32_t;
   typedef __int64 int_fast64_t;
   typedef unsigned __int8 uint_fast8_t;
   typedef unsigned __int16 uint_fast16_t;
   typedef unsigned __int32 uint_fast32_t;
   typedef unsigned __int64 uint_fast64_t;
   typedef __int64 intmax_t;
   typedef unsigned __int64 uintmax_t;
# else
#  include <stdint.h>
# endif
# if _MSC_VER < 1800   /* MSVC < 2013 */
#  ifndef __cplusplus
    typedef unsigned char _Bool;
#  endif
# endif
# define _cffi_float_complex_t   _Fcomplex    /* include <complex.h> for it */
# define _cffi_double_complex_t  _Dcomplex    /* include <complex.h> for it */
#else
# include <stdint.h>
# if (defined (__SVR4) && defined (__sun)) || defined(_AIX) || defined(__hpux)
#  include <alloca.h>
# endif
# define _cffi_float_complex_t   float _Complex
# define _cffi_double_complex_t  double _Complex
#endif

#ifdef __GNUC__
# define _CFFI_UNUSED_FN  __attribute__((unused))
#else
# define _CFFI_UNUSED_FN  /* nothing */
#endif

#ifdef __cplusplus
# ifndef _Bool
   typedef bool _Bool;   /* semi-hackish: C++ has no _Bool; bool is builtin */
# endif
#endif

/**********  CPython-specific section  **********/
#ifndef PYPY_VERSION


#if PY_MAJOR_VERSION >= 3
# define PyInt_FromLong PyLong_FromLong
#endif

#define _cffi_from_c_double PyFloat_FromDouble
#define _cffi_from_c_float PyFloat_FromDouble
#define _cffi_from_c_long PyInt_FromLong
#define _cffi_from_c_ulong PyLong_FromUnsignedLong
#define _cffi_from_c_longlong PyLong_FromLongLong
#define _cffi_from_c_ulonglong PyLong_FromUnsignedLongLong
#define _cffi_from_c__Bool PyBool_FromLong

#define _cffi_to_c_double PyFloat_AsDouble
#define _cffi_to_c_float PyFloat_AsDouble

#define _cffi_from_c_int(x, type)                                        \
    (((type)-1) > 0 ? /* unsigned */                                     \
        (sizeof(type) < sizeof(long) ?                                   \
            PyInt_FromLong((long)x) :                                    \
         sizeof(type) == sizeof(long) ?                                  \
            PyLong_FromUnsignedLong((unsigned long)x) :                  \
            PyLong_FromUnsignedLongLong((unsigned long long)x)) :        \
        (sizeof(type) <= sizeof(long) ?                                  \
            PyInt_FromLong((long)x) :                                    \
            PyLong_FromLongLong((long long)x)))

#define _cffi_to_c_int(o, type)                                          \
    ((type)(                                                             \
     sizeof(type) == 1 ? (((type)-1) > 0 ? (type)_cffi_to_c_u8(o)        \
                                         : (type)_cffi_to_c_i8(o)) :    \
     sizeof(type) == 2 ? (((type)-1) > 0 ? (type)_cffi_to_c_u16(o)      \
                                         : (type)_cffi_to_c_i16(o)) :   \
     sizeof(type) == 4 ? (((type)-1) > 0 ? (type)_cffi_to_c_u32(o)      \
                                         : (type)_cffi_to_c_i32(o)) :   \
     sizeof(type) == 8 ? (((type)-1) > 0 ? (type)_cffi_to_c_u64(o)      \
                                         : (type)_cffi_to_c_i64(o)) :   \
     (Py_FatalError("unsupported size for type " #type), (type)0)))

#define _cffi_to_c_i8                                                    \
                 ((int(*)(PyObject *))_cffi_exports[1])
#define _cffi_to_c_u8                                                    \
                 ((int(*)(PyObject *))_cffi_exports[2])
#define _cffi_to_c_i16                                                   \
                 ((int(*)(PyObject *))_cffi_exports[3])
#define _cffi_to_c_u16                                                   \
                 ((int(*)(PyObject *))_cffi_exports[4])
#define _cffi_to_c_i32                                                   \
                 ((int(*)(PyObject *))_cffi_exports[5])
#define _cffi_to_c_u32                                                   \
                 ((unsigned int(*)(PyObject *))_cffi_exports[6])
#define _cffi_to_c_i64                                                   \
                 ((long long(*)(PyObject *))_cffi_exports[7])
#define _cffi_to_c_u64                                                   \
                 ((unsigned long long(*)(PyObject *))_cffi_exports[8])
#define _cffi_to_c_char                                                  \
                 ((int(*)(PyObject *))_cffi_exports[9])
#define _cffi_from_c_pointer                                             \
    ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[10])
#define _cffi_to_c_pointer                                               \
    ((char *(*)(PyObject *, struct _cffi_ctypedescr *))_cffi_exports[11])
#define _cffi_get_struct_layout                                          \
    not used any more
#define _cffi_restore_errno                                              \
    ((void(*)(void))_cffi_exports[13])
#define _cffi_save_errno                                                 \
    ((void(*)(void))_cffi_exports[14])
#define _cffi_from_c_char                                                \
    ((PyObject *(*)(char))_cffi_exports[15])
#define _cffi_from_c_deref                                               \
    ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[16])
#define _cffi_to_c                                                       \
    ((int(*)(char *, struct _cffi_ctypedescr *, PyObject *))_cffi_exports[17])
#define _cffi_from_c_struct                                              \
    ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[18])
#define _cffi_to_c_wchar_t                                               \
    ((_cffi_wchar_t(*)(PyObject *))_cffi_exports[19])
#define _cffi_from_c_wchar_t                                             \
    ((PyObject *(*)(_cffi_wchar_t))_cffi_exports[20])
#define _cffi_to_c_long_double                                           \
    ((long double(*)(PyObject *))_cffi_exports[21])
#define _cffi_to_c__Bool                                                 \
    ((_Bool(*)(PyObject *))_cffi_exports[22])
#define _cffi_prepare_pointer_call_argument                              \
    ((Py_ssize_t(*)(struct _cffi_ctypedescr *,                           \
                    PyObject *, char **))_cffi_exports[23])
#define _cffi_convert_array_from_object                                  \
    ((int(*)(char *, struct _cffi_ctypedescr *, PyObject *))_cffi_exports[24])
#define _CFFI_CPIDX  25
#define _cffi_call_python                                                \
    ((void(*)(struct _cffi_externpy_s *, char *))_cffi_exports[_CFFI_CPIDX])
#define _cffi_to_c_wchar3216_t                                           \
    ((int(*)(PyObject *))_cffi_exports[26])
#define _cffi_from_c_wchar3216_t                                         \
    ((PyObject *(*)(int))_cffi_exports[27])
#define _CFFI_NUM_EXPORTS 28

struct _cffi_ctypedescr;

static void *_cffi_exports[_CFFI_NUM_EXPORTS];

#define _cffi_type(index)   (                           \
    assert((((uintptr_t)_cffi_types[index]) & 1) == 0), \
    (struct _cffi_ctypedescr *)_cffi_types[index])

static PyObject *_cffi_init(const char *module_name, Py_ssize_t version,
                            const struct _cffi_type_context_s *ctx)
{
    PyObject *module, *o_arg, *new_module;
    void *raw[] = {
        (void *)module_name,
        (void *)version,
        (void *)_cffi_exports,
        (void *)ctx,
    };

    module = PyImport_ImportModule("_cffi_backend");
    if (module == NULL)
        goto failure;

    o_arg = PyLong_FromVoidPtr((void *)raw);
    if (o_arg == NULL)
        goto failure;

    new_module = PyObject_CallMethod(
        module, (char *)"_init_cffi_1_0_external_module", (char *)"O", o_arg);

    Py_DECREF(o_arg);
    Py_DECREF(module);
    return new_module;

  failure:
    Py_XDECREF(module);
    return NULL;
}


#ifdef HAVE_WCHAR_H
typedef wchar_t _cffi_wchar_t;
#else
typedef uint16_t _cffi_wchar_t;   /* same random pick as _cffi_backend.c */
#endif

_CFFI_UNUSED_FN static uint16_t _cffi_to_c_char16_t(PyObject *o)
{
    if (sizeof(_cffi_wchar_t) == 2)
        return (uint16_t)_cffi_to_c_wchar_t(o);
    else
        return (uint16_t)_cffi_to_c_wchar3216_t(o);
}

_CFFI_UNUSED_FN static PyObject *_cffi_from_c_char16_t(uint16_t x)
{
    if (sizeof(_cffi_wchar_t) == 2)
        return _cffi_from_c_wchar_t((_cffi_wchar_t)x);
    else
        return _cffi_from_c_wchar3216_t((int)x);
}

_CFFI_UNUSED_FN static int _cffi_to_c_char32_t(PyObject *o)
{
    if (sizeof(_cffi_wchar_t) == 4)
        return (int)_cffi_to_c_wchar_t(o);
    else
        return (int)_cffi_to_c_wchar3216_t(o);
}

_CFFI_UNUSED_FN static PyObject *_cffi_from_c_char32_t(unsigned int x)
{
    if (sizeof(_cffi_wchar_t) == 4)
        return _cffi_from_c_wchar_t((_cffi_wchar_t)x);
    else
        return _cffi_from_c_wchar3216_t((int)x);
}

union _cffi_union_alignment_u {
    unsigned char m_char;
    unsigned short m_short;
    unsigned int m_int;
    unsigned long m_long;
    unsigned long long m_longlong;
    float m_float;
    double m_double;
    long double m_longdouble;
};

struct _cffi_freeme_s {
    struct _cffi_freeme_s *next;
    union _cffi_union_alignment_u alignment;
};

_CFFI_UNUSED_FN static int
_cffi_convert_array_argument(struct _cffi_ctypedescr *ctptr, PyObject *arg,
                             char **output_data, Py_ssize_t datasize,
                             struct _cffi_freeme_s **freeme)
{
    char *p;
    if (datasize < 0)
        return -1;

    p = *output_data;
    if (p == NULL) {
        struct _cffi_freeme_s *fp = (struct _cffi_freeme_s *)PyObject_Malloc(
            offsetof(struct _cffi_freeme_s, alignment) + (size_t)datasize);
        if (fp == NULL)
            return -1;
        fp->next = *freeme;
        *freeme = fp;
        p = *output_data = (char *)&fp->alignment;
    }
    memset((void *)p, 0, (size_t)datasize);
    return _cffi_convert_array_from_object(p, ctptr, arg);
}

_CFFI_UNUSED_FN static void
_cffi_free_array_arguments(struct _cffi_freeme_s *freeme)
{
    do {
        void *p = (void *)freeme;
        freeme = freeme->next;
        PyObject_Free(p);
    } while (freeme != NULL);
}

/**********  end CPython-specific section  **********/
#else
_CFFI_UNUSED_FN
static void (*_cffi_call_python_org)(struct _cffi_externpy_s *, char *);
# define _cffi_call_python  _cffi_call_python_org
#endif


#define _cffi_array_len(array)   (sizeof(array) / sizeof((array)[0]))

#define _cffi_prim_int(size, sign)                                      \
    ((size) == 1 ? ((sign) ? _CFFI_PRIM_INT8  : _CFFI_PRIM_UINT8)  :    \
     (size) == 2 ? ((sign) ? _CFFI_PRIM_INT16 : _CFFI_PRIM_UINT16) :    \
|
||||
(size) == 4 ? ((sign) ? _CFFI_PRIM_INT32 : _CFFI_PRIM_UINT32) : \
|
||||
(size) == 8 ? ((sign) ? _CFFI_PRIM_INT64 : _CFFI_PRIM_UINT64) : \
|
||||
_CFFI__UNKNOWN_PRIM)
|
||||
|
||||
#define _cffi_prim_float(size) \
|
||||
((size) == sizeof(float) ? _CFFI_PRIM_FLOAT : \
|
||||
(size) == sizeof(double) ? _CFFI_PRIM_DOUBLE : \
|
||||
(size) == sizeof(long double) ? _CFFI__UNKNOWN_LONG_DOUBLE : \
|
||||
_CFFI__UNKNOWN_FLOAT_PRIM)
|
||||
|
||||
#define _cffi_check_int(got, got_nonpos, expected) \
|
||||
((got_nonpos) == (expected <= 0) && \
|
||||
(got) == (unsigned long long)expected)
|
||||
|
||||
#ifdef MS_WIN32
|
||||
# define _cffi_stdcall __stdcall
|
||||
#else
|
||||
# define _cffi_stdcall /* nothing */
|
||||
#endif
|
||||
|
||||
#ifdef __cplusplus
|
||||
}
|
||||
#endif
|
||||
550
venv/lib/python3.12/site-packages/cffi/_embedding.h
Normal file
@@ -0,0 +1,550 @@

/***** Support code for embedding *****/

#ifdef __cplusplus
extern "C" {
#endif


#if defined(_WIN32)
#  define CFFI_DLLEXPORT  __declspec(dllexport)
#elif defined(__GNUC__)
#  define CFFI_DLLEXPORT  __attribute__((visibility("default")))
#else
#  define CFFI_DLLEXPORT  /* nothing */
#endif


/* There are two global variables of type _cffi_call_python_fnptr:

   * _cffi_call_python, which we declare just below, is the one called
     by ``extern "Python"`` implementations.

   * _cffi_call_python_org, which on CPython is actually part of the
     _cffi_exports[] array, is the function pointer copied from
     _cffi_backend.  If _cffi_start_python() fails, then this is set
     to NULL; otherwise, it should never be NULL.

   After initialization is complete, both are equal.  However, the
   first one remains equal to &_cffi_start_and_call_python until the
   very end of initialization, when we are (or should be) sure that
   concurrent threads also see a completely initialized world, and
   only then is it changed.
*/
#undef _cffi_call_python
typedef void (*_cffi_call_python_fnptr)(struct _cffi_externpy_s *, char *);
static void _cffi_start_and_call_python(struct _cffi_externpy_s *, char *);
static _cffi_call_python_fnptr _cffi_call_python = &_cffi_start_and_call_python;


#ifndef _MSC_VER
   /* --- Assuming a GCC not infinitely old --- */
# define cffi_compare_and_swap(l,o,n)  __sync_bool_compare_and_swap(l,o,n)
# define cffi_write_barrier()          __sync_synchronize()
# if !defined(__amd64__) && !defined(__x86_64__) &&   \
     !defined(__i386__) && !defined(__i386)
#   define cffi_read_barrier()         __sync_synchronize()
# else
#   define cffi_read_barrier()         (void)0
# endif
#else
   /* --- Windows threads version --- */
# include <Windows.h>
# define cffi_compare_and_swap(l,o,n)                     \
      (InterlockedCompareExchangePointer(l,n,o) == (o))
# define cffi_write_barrier()   InterlockedCompareExchange(&_cffi_dummy,0,0)
# define cffi_read_barrier()    (void)0
static volatile LONG _cffi_dummy;
#endif

#ifdef WITH_THREAD
# ifndef _MSC_VER
#  include <pthread.h>
   static pthread_mutex_t _cffi_embed_startup_lock;
# else
   static CRITICAL_SECTION _cffi_embed_startup_lock;
# endif
  static char _cffi_embed_startup_lock_ready = 0;
#endif

static void _cffi_acquire_reentrant_mutex(void)
{
    static void *volatile lock = NULL;

    while (!cffi_compare_and_swap(&lock, NULL, (void *)1)) {
        /* should ideally do a spin loop instruction here, but
           hard to do it portably and doesn't really matter I
           think: pthread_mutex_init() should be very fast, and
           this is only run at start-up anyway. */
    }

#ifdef WITH_THREAD
    if (!_cffi_embed_startup_lock_ready) {
# ifndef _MSC_VER
        pthread_mutexattr_t attr;
        pthread_mutexattr_init(&attr);
        pthread_mutexattr_settype(&attr, PTHREAD_MUTEX_RECURSIVE);
        pthread_mutex_init(&_cffi_embed_startup_lock, &attr);
# else
        InitializeCriticalSection(&_cffi_embed_startup_lock);
# endif
        _cffi_embed_startup_lock_ready = 1;
    }
#endif

    while (!cffi_compare_and_swap(&lock, (void *)1, NULL))
        ;

#ifndef _MSC_VER
    pthread_mutex_lock(&_cffi_embed_startup_lock);
#else
    EnterCriticalSection(&_cffi_embed_startup_lock);
#endif
}

static void _cffi_release_reentrant_mutex(void)
{
#ifndef _MSC_VER
    pthread_mutex_unlock(&_cffi_embed_startup_lock);
#else
    LeaveCriticalSection(&_cffi_embed_startup_lock);
#endif
}


/**********  CPython-specific section  **********/
#ifndef PYPY_VERSION

#include "_cffi_errors.h"


#define _cffi_call_python_org  _cffi_exports[_CFFI_CPIDX]

PyMODINIT_FUNC _CFFI_PYTHON_STARTUP_FUNC(void);   /* forward */

static void _cffi_py_initialize(void)
{
    /* XXX use initsigs=0, which "skips initialization registration of
       signal handlers, which might be useful when Python is
       embedded" according to the Python docs.  But review and think
       if it should be a user-controllable setting.

       XXX we should also give a way to write errors to a buffer
       instead of to stderr.

       XXX if importing 'site' fails, CPython (any version) calls
       exit().  Should we try to work around this behavior here?
    */
    Py_InitializeEx(0);
}

static int _cffi_initialize_python(void)
{
    /* This initializes Python, imports _cffi_backend, and then the
       present .dll/.so is set up as a CPython C extension module.
    */
    int result;
    PyGILState_STATE state;
    PyObject *pycode=NULL, *global_dict=NULL, *x;
    PyObject *builtins;

    state = PyGILState_Ensure();

    /* Call the initxxx() function from the present module.  It will
       create and initialize us as a CPython extension module, instead
       of letting the startup Python code do it---it might reimport
       the same .dll/.so and get maybe confused on some platforms.
       It might also have troubles locating the .dll/.so again for all
       I know.
    */
    (void)_CFFI_PYTHON_STARTUP_FUNC();
    if (PyErr_Occurred())
        goto error;

    /* Now run the Python code provided to ffi.embedding_init_code().
    */
    pycode = Py_CompileString(_CFFI_PYTHON_STARTUP_CODE,
                              "<init code for '" _CFFI_MODULE_NAME "'>",
                              Py_file_input);
    if (pycode == NULL)
        goto error;
    global_dict = PyDict_New();
    if (global_dict == NULL)
        goto error;
    builtins = PyEval_GetBuiltins();
    if (builtins == NULL)
        goto error;
    if (PyDict_SetItemString(global_dict, "__builtins__", builtins) < 0)
        goto error;
    x = PyEval_EvalCode(
#if PY_MAJOR_VERSION < 3
                        (PyCodeObject *)
#endif
                        pycode, global_dict, global_dict);
    if (x == NULL)
        goto error;
    Py_DECREF(x);

    /* Done!  Now if we've been called from
       _cffi_start_and_call_python() in an ``extern "Python"``, we can
       only hope that the Python code did correctly set up the
       corresponding @ffi.def_extern() function.  Otherwise, the
       general logic of ``extern "Python"`` functions (inside the
       _cffi_backend module) will find that the reference is still
       missing and print an error.
    */
    result = 0;
 done:
    Py_XDECREF(pycode);
    Py_XDECREF(global_dict);
    PyGILState_Release(state);
    return result;

 error:;
    {
        /* Print as much information as potentially useful.
           Debugging load-time failures with embedding is not fun
        */
        PyObject *ecap;
        PyObject *exception, *v, *tb, *f, *modules, *mod;
        PyErr_Fetch(&exception, &v, &tb);
        ecap = _cffi_start_error_capture();
        f = PySys_GetObject((char *)"stderr");
        if (f != NULL && f != Py_None) {
            PyFile_WriteString(
                "Failed to initialize the Python-CFFI embedding logic:\n\n", f);
        }

        if (exception != NULL) {
            PyErr_NormalizeException(&exception, &v, &tb);
            PyErr_Display(exception, v, tb);
        }
        Py_XDECREF(exception);
        Py_XDECREF(v);
        Py_XDECREF(tb);

        if (f != NULL && f != Py_None) {
            PyFile_WriteString("\nFrom: " _CFFI_MODULE_NAME
                               "\ncompiled with cffi version: 2.0.0"
                               "\n_cffi_backend module: ", f);
            modules = PyImport_GetModuleDict();
            mod = PyDict_GetItemString(modules, "_cffi_backend");
            if (mod == NULL) {
                PyFile_WriteString("not loaded", f);
            }
            else {
                v = PyObject_GetAttrString(mod, "__file__");
                PyFile_WriteObject(v, f, 0);
                Py_XDECREF(v);
            }
            PyFile_WriteString("\nsys.path: ", f);
            PyFile_WriteObject(PySys_GetObject((char *)"path"), f, 0);
            PyFile_WriteString("\n\n", f);
        }
        _cffi_stop_error_capture(ecap);
    }
    result = -1;
    goto done;
}

#if PY_VERSION_HEX < 0x03080000
PyAPI_DATA(char *) _PyParser_TokenNames[];  /* from CPython */
#endif

static int _cffi_carefully_make_gil(void)
{
    /* This does the basic initialization of Python.  It can be called
       completely concurrently from unrelated threads.  It assumes
       that we don't hold the GIL before (if it exists), and we don't
       hold it afterwards.

       (What it really does used to be completely different in Python 2
       and Python 3, with the Python 2 solution avoiding the spin-lock
       around the Py_InitializeEx() call.  However, after recent changes
       to CPython 2.7 (issue #358) it no longer works.  So we use the
       Python 3 solution everywhere.)

       This initializes Python by calling Py_InitializeEx().
       Important: this must not be called concurrently at all.
       So we use a global variable as a simple spin lock.  This global
       variable must be from 'libpythonX.Y.so', not from this
       cffi-based extension module, because it must be shared from
       different cffi-based extension modules.

       In Python < 3.8, we choose
       _PyParser_TokenNames[0] as a completely arbitrary pointer value
       that is never written to.  The default is to point to the
       string "ENDMARKER".  We change it temporarily to point to the
       next character in that string.  (Yes, I know it's REALLY
       obscure.)

       In Python >= 3.8, this string array is no longer writable, so
       instead we pick PyCapsuleType.tp_version_tag.  We can't change
       Python < 3.8 because someone might use a mixture of cffi
       embedded modules, some of which were compiled before this file
       changed.

       In Python >= 3.12, this stopped working because that particular
       tp_version_tag gets modified during interpreter startup.  It's
       arguably a bad idea before 3.12 too, but again we can't change
       that because someone might use a mixture of cffi embedded
       modules, and no-one reported a bug so far.  In Python >= 3.12
       we go instead for PyCapsuleType.tp_as_buffer, which is supposed
       to always be NULL.  We write to it temporarily a pointer to
       a struct full of NULLs, which is semantically the same.
    */

#ifdef WITH_THREAD
# if PY_VERSION_HEX < 0x03080000
    char *volatile *lock = (char *volatile *)_PyParser_TokenNames;
    char *old_value, *locked_value;

    while (1) {    /* spin loop */
        old_value = *lock;
        locked_value = old_value + 1;
        if (old_value[0] == 'E') {
            assert(old_value[1] == 'N');
            if (cffi_compare_and_swap(lock, old_value, locked_value))
                break;
        }
        else {
            assert(old_value[0] == 'N');
            /* should ideally do a spin loop instruction here, but
               hard to do it portably and doesn't really matter I
               think: PyEval_InitThreads() should be very fast, and
               this is only run at start-up anyway. */
        }
    }
# else
#  if PY_VERSION_HEX < 0x030C0000
    int volatile *lock = (int volatile *)&PyCapsule_Type.tp_version_tag;
    int old_value, locked_value = -42;
    assert(!(PyCapsule_Type.tp_flags & Py_TPFLAGS_HAVE_VERSION_TAG));
#  else
    static struct ebp_s { PyBufferProcs buf; int mark; } empty_buffer_procs;
    empty_buffer_procs.mark = -42;
    PyBufferProcs *volatile *lock = (PyBufferProcs *volatile *)
        &PyCapsule_Type.tp_as_buffer;
    PyBufferProcs *old_value, *locked_value = &empty_buffer_procs.buf;
#  endif

    while (1) {    /* spin loop */
        old_value = *lock;
        if (old_value == 0) {
            if (cffi_compare_and_swap(lock, old_value, locked_value))
                break;
        }
        else {
#  if PY_VERSION_HEX < 0x030C0000
            assert(old_value == locked_value);
#  else
            /* The pointer should point to a possibly different
               empty_buffer_procs from another C extension module */
            assert(((struct ebp_s *)old_value)->mark == -42);
#  endif
            /* should ideally do a spin loop instruction here, but
               hard to do it portably and doesn't really matter I
               think: PyEval_InitThreads() should be very fast, and
               this is only run at start-up anyway. */
        }
    }
# endif
#endif

    /* call Py_InitializeEx() */
    if (!Py_IsInitialized()) {
        _cffi_py_initialize();
#if PY_VERSION_HEX < 0x03070000
        PyEval_InitThreads();
#endif
        PyEval_SaveThread();  /* release the GIL */
        /* the returned tstate must be the one that has been stored into the
           autoTLSkey by _PyGILState_Init() called from Py_Initialize(). */
    }
    else {
#if PY_VERSION_HEX < 0x03070000
        /* PyEval_InitThreads() is always a no-op from CPython 3.7 */
        PyGILState_STATE state = PyGILState_Ensure();
        PyEval_InitThreads();
        PyGILState_Release(state);
#endif
    }

#ifdef WITH_THREAD
    /* release the lock */
    while (!cffi_compare_and_swap(lock, locked_value, old_value))
        ;
#endif

    return 0;
}

/**********  end CPython-specific section  **********/


#else


/**********  PyPy-specific section  **********/

PyMODINIT_FUNC _CFFI_PYTHON_STARTUP_FUNC(const void *[]);   /* forward */

static struct _cffi_pypy_init_s {
    const char *name;
    void *func;    /* function pointer */
    const char *code;
} _cffi_pypy_init = {
    _CFFI_MODULE_NAME,
    _CFFI_PYTHON_STARTUP_FUNC,
    _CFFI_PYTHON_STARTUP_CODE,
};

extern int pypy_carefully_make_gil(const char *);
extern int pypy_init_embedded_cffi_module(int, struct _cffi_pypy_init_s *);

static int _cffi_carefully_make_gil(void)
{
    return pypy_carefully_make_gil(_CFFI_MODULE_NAME);
}

static int _cffi_initialize_python(void)
{
    return pypy_init_embedded_cffi_module(0xB011, &_cffi_pypy_init);
}

/**********  end PyPy-specific section  **********/


#endif


#ifdef __GNUC__
__attribute__((noinline))
#endif
static _cffi_call_python_fnptr _cffi_start_python(void)
{
    /* Delicate logic to initialize Python.  This function can be
       called multiple times concurrently, e.g. when the process calls
       its first ``extern "Python"`` functions in multiple threads at
       once.  It can also be called recursively, in which case we must
       ignore it.  We also have to consider what occurs if several
       different cffi-based extensions reach this code in parallel
       threads---it is a different copy of the code, then, and we
       can't have any shared global variable unless it comes from
       'libpythonX.Y.so'.

       Idea:

       * _cffi_carefully_make_gil(): "carefully" call
         PyEval_InitThreads() (possibly with Py_InitializeEx() first).

       * then we use a (local) custom lock to make sure that a call to this
         cffi-based extension will wait if another call to the *same*
         extension is running the initialization in another thread.
         It is reentrant, so that a recursive call will not block, but
         only one from a different thread.

       * then we grab the GIL and (Python 2) we call Py_InitializeEx().
         At this point, concurrent calls to Py_InitializeEx() are not
         possible: we have the GIL.

       * do the rest of the specific initialization, which may
         temporarily release the GIL but not the custom lock.
         Only release the custom lock when we are done.
    */
    static char called = 0;

    if (_cffi_carefully_make_gil() != 0)
        return NULL;

    _cffi_acquire_reentrant_mutex();

    /* Here the GIL exists, but we don't have it.  We're only protected
       from concurrency by the reentrant mutex. */

    /* This file only initializes the embedded module once, the first
       time this is called, even if there are subinterpreters. */
    if (!called) {
        called = 1;  /* invoke _cffi_initialize_python() only once,
                        but don't set '_cffi_call_python' right now,
                        otherwise concurrent threads won't call
                        this function at all (we need them to wait) */
        if (_cffi_initialize_python() == 0) {
            /* now initialization is finished.  Switch to the fast-path. */

            /* We would like nobody to see the new value of
               '_cffi_call_python' without also seeing the rest of the
               data initialized.  However, this is not possible.  But
               the new value of '_cffi_call_python' is the function
               'cffi_call_python()' from _cffi_backend.  So:  */
            cffi_write_barrier();
            /* ^^^ we put a write barrier here, and a corresponding
               read barrier at the start of cffi_call_python().  This
               ensures that after that read barrier, we see everything
               done here before the write barrier.
            */

            assert(_cffi_call_python_org != NULL);
            _cffi_call_python = (_cffi_call_python_fnptr)_cffi_call_python_org;
        }
        else {
            /* initialization failed.  Reset this to NULL, even if it was
               already set to some other value.  Future calls to
               _cffi_start_python() are still forced to occur, and will
               always return NULL from now on. */
            _cffi_call_python_org = NULL;
        }
    }

    _cffi_release_reentrant_mutex();

    return (_cffi_call_python_fnptr)_cffi_call_python_org;
}

static
void _cffi_start_and_call_python(struct _cffi_externpy_s *externpy, char *args)
{
    _cffi_call_python_fnptr fnptr;
    int current_err = errno;
#ifdef _MSC_VER
    int current_lasterr = GetLastError();
#endif
    fnptr = _cffi_start_python();
    if (fnptr == NULL) {
        fprintf(stderr, "function %s() called, but initialization code "
                        "failed.  Returning 0.\n", externpy->name);
        memset(args, 0, externpy->size_of_result);
    }
#ifdef _MSC_VER
    SetLastError(current_lasterr);
#endif
    errno = current_err;

    if (fnptr != NULL)
        fnptr(externpy, args);
}


/* The cffi_start_python() function makes sure Python is initialized
   and our cffi module is set up.  It can be called manually from the
   user C code.  The same effect is obtained automatically from any
   dll-exported ``extern "Python"`` function.  This function returns
   -1 if initialization failed, 0 if all is OK.  */
_CFFI_UNUSED_FN
static int cffi_start_python(void)
{
    if (_cffi_call_python == &_cffi_start_and_call_python) {
        if (_cffi_start_python() == NULL)
            return -1;
    }
    cffi_read_barrier();
    return 0;
}

#undef cffi_compare_and_swap
#undef cffi_write_barrier
#undef cffi_read_barrier

#ifdef __cplusplus
}
#endif
83
venv/lib/python3.12/site-packages/cffi/_imp_emulation.py
Normal file
@@ -0,0 +1,83 @@

try:
    # this works on Python < 3.12
    from imp import *

except ImportError:
    # this is a limited emulation for Python >= 3.12.
    # Note that this is used only for tests or for the old ffi.verify().
    # This is copied from the source code of Python 3.11.

    from _imp import (acquire_lock, release_lock,
                      is_builtin, is_frozen)

    from importlib._bootstrap import _load

    from importlib import machinery
    import os
    import sys
    import tokenize

    SEARCH_ERROR = 0
    PY_SOURCE = 1
    PY_COMPILED = 2
    C_EXTENSION = 3
    PY_RESOURCE = 4
    PKG_DIRECTORY = 5
    C_BUILTIN = 6
    PY_FROZEN = 7
    PY_CODERESOURCE = 8
    IMP_HOOK = 9

    def get_suffixes():
        extensions = [(s, 'rb', C_EXTENSION)
                      for s in machinery.EXTENSION_SUFFIXES]
        source = [(s, 'r', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES]
        bytecode = [(s, 'rb', PY_COMPILED) for s in machinery.BYTECODE_SUFFIXES]
        return extensions + source + bytecode

    def find_module(name, path=None):
        if not isinstance(name, str):
            raise TypeError("'name' must be a str, not {}".format(type(name)))
        elif not isinstance(path, (type(None), list)):
            # Backwards-compatibility
            raise RuntimeError("'path' must be None or a list, "
                               "not {}".format(type(path)))

        if path is None:
            if is_builtin(name):
                return None, None, ('', '', C_BUILTIN)
            elif is_frozen(name):
                return None, None, ('', '', PY_FROZEN)
            else:
                path = sys.path

        for entry in path:
            package_directory = os.path.join(entry, name)
            for suffix in ['.py', machinery.BYTECODE_SUFFIXES[0]]:
                package_file_name = '__init__' + suffix
                file_path = os.path.join(package_directory, package_file_name)
                if os.path.isfile(file_path):
                    return None, package_directory, ('', '', PKG_DIRECTORY)
            for suffix, mode, type_ in get_suffixes():
                file_name = name + suffix
                file_path = os.path.join(entry, file_name)
                if os.path.isfile(file_path):
                    break
            else:
                continue
            break  # Break out of outer loop when breaking out of inner loop.
        else:
            raise ImportError(name, name=name)

        encoding = None
        if 'b' not in mode:
            with open(file_path, 'rb') as file:
                encoding = tokenize.detect_encoding(file.readline)[0]
        file = open(file_path, mode, encoding=encoding)
        return file, file_path, (suffix, mode, type_)

    def load_dynamic(name, path, file=None):
        loader = machinery.ExtensionFileLoader(name, path)
        spec = machinery.ModuleSpec(name=name, loader=loader, origin=path)
        return _load(spec)
@@ -0,0 +1,45 @@
"""
Temporary shim module to indirect the bits of distutils we need from setuptools/distutils while providing useful
error messages beyond `No module named 'distutils'` on Python >= 3.12, or when setuptools' vendored distutils is broken.

This is a compromise to avoid a hard-dep on setuptools for Python >= 3.12, since many users don't need runtime compilation support from CFFI.
"""
import sys

try:
    # import setuptools first; this is the most robust way to ensure its embedded distutils is available
    # (the .pth shim should usually work, but this is even more robust)
    import setuptools
except Exception as ex:
    if sys.version_info >= (3, 12):
        # Python 3.12 has no built-in distutils to fall back on, so any import problem is fatal
        raise Exception("This CFFI feature requires setuptools on Python >= 3.12. The setuptools module is missing or non-functional.") from ex

    # silently ignore on older Pythons (support fallback to stdlib distutils where available)
else:
    del setuptools

try:
    # bring in just the bits of distutils we need, whether they really came from setuptools or stdlib-embedded distutils
    from distutils import log, sysconfig
    from distutils.ccompiler import CCompiler
    from distutils.command.build_ext import build_ext
    from distutils.core import Distribution, Extension
    from distutils.dir_util import mkpath
    from distutils.errors import DistutilsSetupError, CompileError, LinkError
    from distutils.log import set_threshold, set_verbosity

    if sys.platform == 'win32':
        try:
            # FUTURE: msvc9compiler module was removed in setuptools 74; consider removing, as it's only used by an ancient patch in `recompiler`
            from distutils.msvc9compiler import MSVCCompiler
        except ImportError:
            MSVCCompiler = None
except Exception as ex:
    if sys.version_info >= (3, 12):
        raise Exception("This CFFI feature requires setuptools on Python >= 3.12. Please install the setuptools package.") from ex

    # anything older, just let the underlying distutils import error fly
    raise Exception("This CFFI feature requires distutils. Please install the distutils or setuptools package.") from ex

del sys
967
venv/lib/python3.12/site-packages/cffi/api.py
Normal file
@@ -0,0 +1,967 @@
|
||||
import sys, types
|
||||
from .lock import allocate_lock
|
||||
from .error import CDefError
|
||||
from . import model
|
||||
|
||||
try:
|
||||
callable
|
||||
except NameError:
|
||||
# Python 3.1
|
||||
from collections import Callable
|
||||
callable = lambda x: isinstance(x, Callable)
|
||||
|
||||
try:
|
||||
basestring
|
||||
except NameError:
|
||||
# Python 3.x
|
||||
basestring = str
|
||||
|
||||
_unspecified = object()
|
||||
|
||||
|
||||
|
||||
class FFI(object):
|
||||
r'''
|
||||
The main top-level class that you instantiate once, or once per module.
|
||||
|
||||
Example usage:
|
||||
|
||||
ffi = FFI()
|
||||
ffi.cdef("""
|
||||
int printf(const char *, ...);
|
||||
""")
|
||||
|
||||
C = ffi.dlopen(None) # standard library
|
||||
-or-
|
||||
C = ffi.verify() # use a C compiler: verify the decl above is right
|
||||
|
||||
C.printf("hello, %s!\n", ffi.new("char[]", "world"))
|
||||
'''
|
||||
|
||||
def __init__(self, backend=None):
|
||||
"""Create an FFI instance. The 'backend' argument is used to
|
||||
select a non-default backend, mostly for tests.
|
||||
"""
|
||||
if backend is None:
|
||||
# You need PyPy (>= 2.0 beta), or a CPython (>= 2.6) with
|
||||
# _cffi_backend.so compiled.
|
||||
import _cffi_backend as backend
|
||||
from . import __version__
|
||||
if backend.__version__ != __version__:
|
||||
# bad version! Try to be as explicit as possible.
|
||||
if hasattr(backend, '__file__'):
|
||||
# CPython
|
||||
                    raise Exception("Version mismatch: this is the 'cffi' package version %s, located in %r.  When we import the top-level '_cffi_backend' extension module, we get version %s, located in %r.  The two versions should be equal; check your installation." % (
                        __version__, __file__,
                        backend.__version__, backend.__file__))
                else:
                    # PyPy
                    raise Exception("Version mismatch: this is the 'cffi' package version %s, located in %r.  This interpreter comes with a built-in '_cffi_backend' module, which is version %s.  The two versions should be equal; check your installation." % (
                        __version__, __file__, backend.__version__))
            # (If you insist you can also try to pass the option
            # 'backend=backend_ctypes.CTypesBackend()', but don't
            # rely on it!  It's probably not going to work well.)

        from . import cparser
        self._backend = backend
        self._lock = allocate_lock()
        self._parser = cparser.Parser()
        self._cached_btypes = {}
        self._parsed_types = types.ModuleType('parsed_types').__dict__
        self._new_types = types.ModuleType('new_types').__dict__
        self._function_caches = []
        self._libraries = []
        self._cdefsources = []
        self._included_ffis = []
        self._windows_unicode = None
        self._init_once_cache = {}
        self._cdef_version = None
        self._embedding = None
        self._typecache = model.get_typecache(backend)
        if hasattr(backend, 'set_ffi'):
            backend.set_ffi(self)
        for name in list(backend.__dict__):
            if name.startswith('RTLD_'):
                setattr(self, name, getattr(backend, name))
        #
        with self._lock:
            self.BVoidP = self._get_cached_btype(model.voidp_type)
            self.BCharA = self._get_cached_btype(model.char_array_type)
        if isinstance(backend, types.ModuleType):
            # _cffi_backend: attach these constants to the class
            if not hasattr(FFI, 'NULL'):
                FFI.NULL = self.cast(self.BVoidP, 0)
                FFI.CData, FFI.CType = backend._get_types()
        else:
            # ctypes backend: attach these constants to the instance
            self.NULL = self.cast(self.BVoidP, 0)
            self.CData, self.CType = backend._get_types()
        self.buffer = backend.buffer
    def cdef(self, csource, override=False, packed=False, pack=None):
        """Parse the given C source.  This registers all declared functions,
        types, and global variables.  The functions and global variables can
        then be accessed via either 'ffi.dlopen()' or 'ffi.verify()'.
        The types can be used in 'ffi.new()' and other functions.
        If 'packed' is specified as True, all structs declared inside this
        cdef are packed, i.e. laid out without any field alignment at all.
        Alternatively, 'pack' can be a small integer, and requests for
        alignment greater than that are ignored (pack=1 is equivalent to
        packed=True).
        """
        self._cdef(csource, override=override, packed=packed, pack=pack)

    def embedding_api(self, csource, packed=False, pack=None):
        self._cdef(csource, packed=packed, pack=pack, dllexport=True)
        if self._embedding is None:
            self._embedding = ''

    def _cdef(self, csource, override=False, **options):
        if not isinstance(csource, str):    # unicode, on Python 2
            if not isinstance(csource, basestring):
                raise TypeError("cdef() argument must be a string")
            csource = csource.encode('ascii')
        with self._lock:
            self._cdef_version = object()
            self._parser.parse(csource, override=override, **options)
            self._cdefsources.append(csource)
            if override:
                for cache in self._function_caches:
                    cache.clear()
            finishlist = self._parser._recomplete
            if finishlist:
                self._parser._recomplete = []
                for tp in finishlist:
                    tp.finish_backend_type(self, finishlist)
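A minimal usage sketch of the `cdef()` API above (illustrative only, not part of this module; the `color_t` typedef is a made-up example):

```python
from cffi import FFI

ffi = FFI()
# Register a typedef with the parser; no C compiler is involved here.
ffi.cdef("typedef uint32_t color_t;")
# The typedef is now usable by name in other ffi calls.
size = ffi.sizeof("color_t")
```

Declarations accumulate across successive `cdef()` calls on the same `FFI` instance, unless `override=True` is passed.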
    def dlopen(self, name, flags=0):
        """Load and return a dynamic library identified by 'name'.
        The standard C library can be loaded by passing None.

        Note that functions and types declared by 'ffi.cdef()' are not
        linked to a particular library, just like C headers; in the
        library we only look for the actual (untyped) symbols.
        """
        if not (isinstance(name, basestring) or
                name is None or
                isinstance(name, self.CData)):
            raise TypeError("dlopen(name): name must be a file name, None, "
                            "or an already-opened 'void *' handle")
        with self._lock:
            lib, function_cache = _make_ffi_library(self, name, flags)
            self._function_caches.append(function_cache)
            self._libraries.append(lib)
        return lib

    def dlclose(self, lib):
        """Close a library obtained with ffi.dlopen().  After this call,
        access to functions or variables from the library will fail
        (possibly with a segmentation fault).
        """
        type(lib).__cffi_close__(lib)

    def _typeof_locked(self, cdecl):
        # call me with the lock!
        key = cdecl
        if key in self._parsed_types:
            return self._parsed_types[key]
        #
        if not isinstance(cdecl, str):    # unicode, on Python 2
            cdecl = cdecl.encode('ascii')
        #
        type = self._parser.parse_type(cdecl)
        really_a_function_type = type.is_raw_function
        if really_a_function_type:
            type = type.as_function_pointer()
        btype = self._get_cached_btype(type)
        result = btype, really_a_function_type
        self._parsed_types[key] = result
        return result
    def _typeof(self, cdecl, consider_function_as_funcptr=False):
        # string -> ctype object
        try:
            result = self._parsed_types[cdecl]
        except KeyError:
            with self._lock:
                result = self._typeof_locked(cdecl)
        #
        btype, really_a_function_type = result
        if really_a_function_type and not consider_function_as_funcptr:
            raise CDefError("the type %r is a function type, not a "
                            "pointer-to-function type" % (cdecl,))
        return btype

    def typeof(self, cdecl):
        """Parse the C type given as a string and return the
        corresponding <ctype> object.
        It can also be used on 'cdata' instance to get its C type.
        """
        if isinstance(cdecl, basestring):
            return self._typeof(cdecl)
        if isinstance(cdecl, self.CData):
            return self._backend.typeof(cdecl)
        if isinstance(cdecl, types.BuiltinFunctionType):
            res = _builtin_function_type(cdecl)
            if res is not None:
                return res
        if (isinstance(cdecl, types.FunctionType)
                and hasattr(cdecl, '_cffi_base_type')):
            with self._lock:
                return self._get_cached_btype(cdecl._cffi_base_type)
        raise TypeError(type(cdecl))

    def sizeof(self, cdecl):
        """Return the size in bytes of the argument.  It can be a
        string naming a C type, or a 'cdata' instance.
        """
        if isinstance(cdecl, basestring):
            BType = self._typeof(cdecl)
            return self._backend.sizeof(BType)
        else:
            return self._backend.sizeof(cdecl)

    def alignof(self, cdecl):
        """Return the natural alignment size in bytes of the C type
        given as a string.
        """
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        return self._backend.alignof(cdecl)

    def offsetof(self, cdecl, *fields_or_indexes):
        """Return the offset of the named field inside the given
        structure or array, which must be given as a C type name.
        You can give several field names in case of nested structures.
        You can also give numeric values which correspond to array
        items, in case of an array type.
        """
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        return self._typeoffsetof(cdecl, *fields_or_indexes)[1]
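A short sketch of `sizeof()`/`offsetof()` on a cdef'd struct (illustrative only; `struct point` is a hypothetical example, and the exact byte values depend on the platform's `int` size and alignment):

```python
from cffi import FFI

ffi = FFI()
ffi.cdef("struct point { int x; int y; };")
# offsetof() resolves the field in the parsed declaration;
# sizeof() asks the backend for the laid-out size.
off_y = ffi.offsetof("struct point", "y")
total = ffi.sizeof("struct point")
```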
    def new(self, cdecl, init=None):
        """Allocate an instance according to the specified C type and
        return a pointer to it.  The specified C type must be either a
        pointer or an array: ``new('X *')`` allocates an X and returns
        a pointer to it, whereas ``new('X[n]')`` allocates an array of
        n X'es and returns an array referencing it (which works
        mostly like a pointer, like in C).  You can also use
        ``new('X[]', n)`` to allocate an array of a non-constant
        length n.

        The memory is initialized following the rules of declaring a
        global variable in C: by default it is zero-initialized, but
        an explicit initializer can be given which can be used to
        fill all or part of the memory.

        When the returned <cdata> object goes out of scope, the memory
        is freed.  In other words the returned <cdata> object has
        ownership of the value of type 'cdecl' that it points to.  This
        means that the raw data can be used as long as this object is
        kept alive, but must not be used for a longer time.  Be careful
        about that when copying the pointer to the memory somewhere
        else, e.g. into another structure.
        """
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        return self._backend.newp(cdecl, init)
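A minimal sketch of the ownership rules described in the `new()` docstring above (illustrative only, not part of this module):

```python
from cffi import FFI

ffi = FFI()
# Zero-initialized owning array; the memory is freed when 'p'
# is garbage-collected, per the docstring above.
p = ffi.new("int[4]")
p[0] = 7
values = ffi.unpack(p, 4)
```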
    def new_allocator(self, alloc=None, free=None,
                      should_clear_after_alloc=True):
        """Return a new allocator, i.e. a function that behaves like ffi.new()
        but uses the provided low-level 'alloc' and 'free' functions.

        'alloc' is called with the size as argument.  If it returns NULL, a
        MemoryError is raised.  'free' is called with the result of 'alloc'
        as argument.  Both can be either Python function or directly C
        functions.  If 'free' is None, then no free function is called.
        If both 'alloc' and 'free' are None, the default is used.

        If 'should_clear_after_alloc' is set to False, then the memory
        returned by 'alloc' is assumed to be already cleared (or you are
        fine with garbage); otherwise CFFI will clear it.
        """
        compiled_ffi = self._backend.FFI()
        allocator = compiled_ffi.new_allocator(alloc, free,
                                               should_clear_after_alloc)
        def allocate(cdecl, init=None):
            if isinstance(cdecl, basestring):
                cdecl = self._typeof(cdecl)
            return allocator(cdecl, init)
        return allocate

    def cast(self, cdecl, source):
        """Similar to a C cast: returns an instance of the named C
        type initialized with the given 'source'.  The source is
        casted between integers or pointers of any type.
        """
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        return self._backend.cast(cdecl, source)
    def string(self, cdata, maxlen=-1):
        """Return a Python string (or unicode string) from the 'cdata'.
        If 'cdata' is a pointer or array of characters or bytes, returns
        the null-terminated string.  The returned string extends until
        the first null character, or at most 'maxlen' characters.  If
        'cdata' is an array then 'maxlen' defaults to its length.

        If 'cdata' is a pointer or array of wchar_t, returns a unicode
        string following the same rules.

        If 'cdata' is a single character or byte or a wchar_t, returns
        it as a string or unicode string.

        If 'cdata' is an enum, returns the value of the enumerator as a
        string, or 'NUMBER' if the value is out of range.
        """
        return self._backend.string(cdata, maxlen)

    def unpack(self, cdata, length):
        """Unpack an array of C data of the given length,
        returning a Python string/unicode/list.

        If 'cdata' is a pointer to 'char', returns a byte string.
        It does not stop at the first null.  This is equivalent to:
        ffi.buffer(cdata, length)[:]

        If 'cdata' is a pointer to 'wchar_t', returns a unicode string.
        'length' is measured in wchar_t's; it is not the size in bytes.

        If 'cdata' is a pointer to anything else, returns a list of
        'length' items.  This is a faster equivalent to:
        [cdata[i] for i in range(length)]
        """
        return self._backend.unpack(cdata, length)

    #def buffer(self, cdata, size=-1):
    #    """Return a read-write buffer object that references the raw C data
    #    pointed to by the given 'cdata'.  The 'cdata' must be a pointer or
    #    an array.  Can be passed to functions expecting a buffer, or directly
    #    manipulated with:
    #
    #        buf[:]          get a copy of it in a regular string, or
    #        buf[idx]        as a single character
    #        buf[:] = ...
    #        buf[idx] = ...  change the content
    #    """
    # note that 'buffer' is a type, set on this instance by __init__
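The difference between `string()` (stops at the first null) and `unpack()` (fixed length, does not stop) can be sketched as follows (illustrative only, not part of this module):

```python
from cffi import FFI

ffi = FFI()
# 'char[]' initialized from bytes gets a trailing NUL appended,
# so this array holds 9 bytes: b"hi\x00there\x00".
buf = ffi.new("char[]", b"hi\x00there")
s = ffi.string(buf)        # stops at the first null character
raw = ffi.unpack(buf, 8)   # reads exactly 8 bytes, nulls included
```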
    def from_buffer(self, cdecl, python_buffer=_unspecified,
                    require_writable=False):
        """Return a cdata of the given type pointing to the data of the
        given Python object, which must support the buffer interface.
        Note that this is not meant to be used on the built-in types
        str or unicode (you can build 'char[]' arrays explicitly)
        but only on objects containing large quantities of raw data
        in some other format, like 'array.array' or numpy arrays.

        The first argument is optional and default to 'char[]'.
        """
        if python_buffer is _unspecified:
            cdecl, python_buffer = self.BCharA, cdecl
        elif isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        return self._backend.from_buffer(cdecl, python_buffer,
                                         require_writable)

    def memmove(self, dest, src, n):
        """ffi.memmove(dest, src, n) copies n bytes of memory from src to dest.

        Like the C function memmove(), the memory areas may overlap;
        apart from that it behaves like the C function memcpy().

        'src' can be any cdata ptr or array, or any Python buffer object.
        'dest' can be any cdata ptr or array, or a writable Python buffer
        object.  The size to copy, 'n', is always measured in bytes.

        Unlike other methods, this one supports all Python buffer including
        byte strings and bytearrays---but it still does not support
        non-contiguous buffers.
        """
        return self._backend.memmove(dest, src, n)
    def callback(self, cdecl, python_callable=None, error=None, onerror=None):
        """Return a callback object or a decorator making such a
        callback object.  'cdecl' must name a C function pointer type.
        The callback invokes the specified 'python_callable' (which may
        be provided either directly or via a decorator).  Important: the
        callback object must be manually kept alive for as long as the
        callback may be invoked from the C level.
        """
        def callback_decorator_wrap(python_callable):
            if not callable(python_callable):
                raise TypeError("the 'python_callable' argument "
                                "is not callable")
            return self._backend.callback(cdecl, python_callable,
                                          error, onerror)
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl, consider_function_as_funcptr=True)
        if python_callable is None:
            return callback_decorator_wrap    # decorator mode
        else:
            return callback_decorator_wrap(python_callable)  # direct mode

    def getctype(self, cdecl, replace_with=''):
        """Return a string giving the C type 'cdecl', which may be itself
        a string or a <ctype> object.  If 'replace_with' is given, it gives
        extra text to append (or insert for more complicated C types), like
        a variable name, or '*' to get actually the C type 'pointer-to-cdecl'.
        """
        if isinstance(cdecl, basestring):
            cdecl = self._typeof(cdecl)
        replace_with = replace_with.strip()
        if (replace_with.startswith('*')
                and '&[' in self._backend.getcname(cdecl, '&')):
            replace_with = '(%s)' % replace_with
        elif replace_with and not replace_with[0] in '[(':
            replace_with = ' ' + replace_with
        return self._backend.getcname(cdecl, replace_with)
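The decorator mode described in the `callback()` docstring can be sketched like this (illustrative only; the resulting cdata is also directly callable from Python, which is what the last line exercises):

```python
from cffi import FFI

ffi = FFI()

# Decorator mode: 'add' is replaced by a <cdata> function pointer.
# Per the docstring above, it must be kept alive while C may call it.
@ffi.callback("int(int, int)")
def add(a, b):
    return a + b

result = add(2, 3)
```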
    def gc(self, cdata, destructor, size=0):
        """Return a new cdata object that points to the same
        data.  Later, when this new cdata object is garbage-collected,
        'destructor(old_cdata_object)' will be called.

        The optional 'size' gives an estimate of the size, used to
        trigger the garbage collection more eagerly.  So far only used
        on PyPy.  It tells the GC that the returned object keeps alive
        roughly 'size' bytes of external memory.
        """
        return self._backend.gcp(cdata, destructor, size)

    def _get_cached_btype(self, type):
        assert self._lock.acquire(False) is False
        # call me with the lock!
        try:
            BType = self._cached_btypes[type]
        except KeyError:
            finishlist = []
            BType = type.get_cached_btype(self, finishlist)
            for type in finishlist:
                type.finish_backend_type(self, finishlist)
        return BType
    def verify(self, source='', tmpdir=None, **kwargs):
        """Verify that the current ffi signatures compile on this
        machine, and return a dynamic library object.  The dynamic
        library can be used to call functions and access global
        variables declared in this 'ffi'.  The library is compiled
        by the C compiler: it gives you C-level API compatibility
        (including calling macros).  This is unlike 'ffi.dlopen()',
        which requires binary compatibility in the signatures.
        """
        from .verifier import Verifier, _caller_dir_pycache
        #
        # If set_unicode(True) was called, insert the UNICODE and
        # _UNICODE macro declarations
        if self._windows_unicode:
            self._apply_windows_unicode(kwargs)
        #
        # Set the tmpdir here, and not in Verifier.__init__: it picks
        # up the caller's directory, which we want to be the caller of
        # ffi.verify(), as opposed to the caller of Verifier().
        tmpdir = tmpdir or _caller_dir_pycache()
        #
        # Make a Verifier() and use it to load the library.
        self.verifier = Verifier(self, source, tmpdir, **kwargs)
        lib = self.verifier.load_library()
        #
        # Save the loaded library for keep-alive purposes, even
        # if the caller doesn't keep it alive itself (it should).
        self._libraries.append(lib)
        return lib

    def _get_errno(self):
        return self._backend.get_errno()
    def _set_errno(self, errno):
        self._backend.set_errno(errno)
    errno = property(_get_errno, _set_errno, None,
                     "the value of 'errno' from/to the C calls")

    def getwinerror(self, code=-1):
        return self._backend.getwinerror(code)

    def _pointer_to(self, ctype):
        with self._lock:
            return model.pointer_cache(self, ctype)
    def addressof(self, cdata, *fields_or_indexes):
        """Return the address of a <cdata 'struct-or-union'>.
        If 'fields_or_indexes' are given, returns the address of that
        field or array item in the structure or array, recursively in
        case of nested structures.
        """
        try:
            ctype = self._backend.typeof(cdata)
        except TypeError:
            if '__addressof__' in type(cdata).__dict__:
                return type(cdata).__addressof__(cdata, *fields_or_indexes)
            raise
        if fields_or_indexes:
            ctype, offset = self._typeoffsetof(ctype, *fields_or_indexes)
        else:
            if ctype.kind == "pointer":
                raise TypeError("addressof(pointer)")
            offset = 0
        ctypeptr = self._pointer_to(ctype)
        return self._backend.rawaddressof(ctypeptr, cdata, offset)

    def _typeoffsetof(self, ctype, field_or_index, *fields_or_indexes):
        ctype, offset = self._backend.typeoffsetof(ctype, field_or_index)
        for field1 in fields_or_indexes:
            ctype, offset1 = self._backend.typeoffsetof(ctype, field1, 1)
            offset += offset1
        return ctype, offset

    def include(self, ffi_to_include):
        """Includes the typedefs, structs, unions and enums defined
        in another FFI instance.  Usage is similar to a #include in C,
        where a part of the program might include types defined in
        another part for its own usage.  Note that the include()
        method has no effect on functions, constants and global
        variables, which must anyway be accessed directly from the
        lib object returned by the original FFI instance.
        """
        if not isinstance(ffi_to_include, FFI):
            raise TypeError("ffi.include() expects an argument that is also of"
                            " type cffi.FFI, not %r" % (
                                type(ffi_to_include).__name__,))
        if ffi_to_include is self:
            raise ValueError("self.include(self)")
        with ffi_to_include._lock:
            with self._lock:
                self._parser.include(ffi_to_include._parser)
                self._cdefsources.append('[')
                self._cdefsources.extend(ffi_to_include._cdefsources)
                self._cdefsources.append(']')
                self._included_ffis.append(ffi_to_include)
    def new_handle(self, x):
        return self._backend.newp_handle(self.BVoidP, x)

    def from_handle(self, x):
        return self._backend.from_handle(x)

    def release(self, x):
        self._backend.release(x)

    def set_unicode(self, enabled_flag):
        """Windows: if 'enabled_flag' is True, enable the UNICODE and
        _UNICODE defines in C, and declare the types like TCHAR and LPTCSTR
        to be (pointers to) wchar_t.  If 'enabled_flag' is False,
        declare these types to be (pointers to) plain 8-bit characters.
        This is mostly for backward compatibility; you usually want True.
        """
        if self._windows_unicode is not None:
            raise ValueError("set_unicode() can only be called once")
        enabled_flag = bool(enabled_flag)
        if enabled_flag:
            self.cdef("typedef wchar_t TBYTE;"
                      "typedef wchar_t TCHAR;"
                      "typedef const wchar_t *LPCTSTR;"
                      "typedef const wchar_t *PCTSTR;"
                      "typedef wchar_t *LPTSTR;"
                      "typedef wchar_t *PTSTR;"
                      "typedef TBYTE *PTBYTE;"
                      "typedef TCHAR *PTCHAR;")
        else:
            self.cdef("typedef char TBYTE;"
                      "typedef char TCHAR;"
                      "typedef const char *LPCTSTR;"
                      "typedef const char *PCTSTR;"
                      "typedef char *LPTSTR;"
                      "typedef char *PTSTR;"
                      "typedef TBYTE *PTBYTE;"
                      "typedef TCHAR *PTCHAR;")
        self._windows_unicode = enabled_flag

    def _apply_windows_unicode(self, kwds):
        defmacros = kwds.get('define_macros', ())
        if not isinstance(defmacros, (list, tuple)):
            raise TypeError("'define_macros' must be a list or tuple")
        defmacros = list(defmacros) + [('UNICODE', '1'),
                                       ('_UNICODE', '1')]
        kwds['define_macros'] = defmacros
    def _apply_embedding_fix(self, kwds):
        # must include an argument like "-lpython2.7" for the compiler
        def ensure(key, value):
            lst = kwds.setdefault(key, [])
            if value not in lst:
                lst.append(value)
        #
        if '__pypy__' in sys.builtin_module_names:
            import os
            if sys.platform == "win32":
                # we need 'libpypy-c.lib'.  Current distributions of
                # pypy (>= 4.1) contain it as 'libs/python27.lib'.
                pythonlib = "python{0[0]}{0[1]}".format(sys.version_info)
                if hasattr(sys, 'prefix'):
                    ensure('library_dirs', os.path.join(sys.prefix, 'libs'))
            else:
                # we need 'libpypy-c.{so,dylib}', which should be by
                # default located in 'sys.prefix/bin' for installed
                # systems.
                if sys.version_info < (3,):
                    pythonlib = "pypy-c"
                else:
                    pythonlib = "pypy3-c"
                if hasattr(sys, 'prefix'):
                    ensure('library_dirs', os.path.join(sys.prefix, 'bin'))
            # On uninstalled pypy's, the libpypy-c is typically found in
            # .../pypy/goal/.
            if hasattr(sys, 'prefix'):
                ensure('library_dirs', os.path.join(sys.prefix, 'pypy', 'goal'))
        else:
            if sys.platform == "win32":
                template = "python%d%d"
                if hasattr(sys, 'gettotalrefcount'):
                    template += '_d'
            else:
                try:
                    import sysconfig
                except ImportError:    # 2.6
                    from cffi._shimmed_dist_utils import sysconfig
                template = "python%d.%d"
                if sysconfig.get_config_var('DEBUG_EXT'):
                    template += sysconfig.get_config_var('DEBUG_EXT')
            pythonlib = (template %
                    (sys.hexversion >> 24, (sys.hexversion >> 16) & 0xff))
            if hasattr(sys, 'abiflags'):
                pythonlib += sys.abiflags
        ensure('libraries', pythonlib)
        if sys.platform == "win32":
            ensure('extra_link_args', '/MANIFEST')
    def set_source(self, module_name, source, source_extension='.c', **kwds):
        import os
        if hasattr(self, '_assigned_source'):
            raise ValueError("set_source() cannot be called several times "
                             "per ffi object")
        if not isinstance(module_name, basestring):
            raise TypeError("'module_name' must be a string")
        if os.sep in module_name or (os.altsep and os.altsep in module_name):
            raise ValueError("'module_name' must not contain '/': use a dotted "
                             "name to make a 'package.module' location")
        self._assigned_source = (str(module_name), source,
                                 source_extension, kwds)

    def set_source_pkgconfig(self, module_name, pkgconfig_libs, source,
                             source_extension='.c', **kwds):
        from . import pkgconfig
        if not isinstance(pkgconfig_libs, list):
            raise TypeError("the pkgconfig_libs argument must be a list "
                            "of package names")
        kwds2 = pkgconfig.flags_from_pkgconfig(pkgconfig_libs)
        pkgconfig.merge_flags(kwds, kwds2)
        self.set_source(module_name, source, source_extension, **kwds)

    def distutils_extension(self, tmpdir='build', verbose=True):
        from cffi._shimmed_dist_utils import mkpath
        from .recompiler import recompile
        #
        if not hasattr(self, '_assigned_source'):
            if hasattr(self, 'verifier'):     # fallback, 'tmpdir' ignored
                return self.verifier.get_extension()
            raise ValueError("set_source() must be called before"
                             " distutils_extension()")
        module_name, source, source_extension, kwds = self._assigned_source
        if source is None:
            raise TypeError("distutils_extension() is only for C extension "
                            "modules, not for dlopen()-style pure Python "
                            "modules")
        mkpath(tmpdir)
        ext, updated = recompile(self, module_name,
                                 source, tmpdir=tmpdir, extradir=tmpdir,
                                 source_extension=source_extension,
                                 call_c_compiler=False, **kwds)
        if verbose:
            if updated:
                sys.stderr.write("regenerated: %r\n" % (ext.sources[0],))
            else:
                sys.stderr.write("not modified: %r\n" % (ext.sources[0],))
        return ext
    def emit_c_code(self, filename):
        from .recompiler import recompile
        #
        if not hasattr(self, '_assigned_source'):
            raise ValueError("set_source() must be called before emit_c_code()")
        module_name, source, source_extension, kwds = self._assigned_source
        if source is None:
            raise TypeError("emit_c_code() is only for C extension modules, "
                            "not for dlopen()-style pure Python modules")
        recompile(self, module_name, source,
                  c_file=filename, call_c_compiler=False,
                  uses_ffiplatform=False, **kwds)

    def emit_python_code(self, filename):
        from .recompiler import recompile
        #
        if not hasattr(self, '_assigned_source'):
            raise ValueError("set_source() must be called before "
                             "emit_python_code()")
        module_name, source, source_extension, kwds = self._assigned_source
        if source is not None:
            raise TypeError("emit_python_code() is only for dlopen()-style "
                            "pure Python modules, not for C extension modules")
        recompile(self, module_name, source,
                  c_file=filename, call_c_compiler=False,
                  uses_ffiplatform=False, **kwds)

    def compile(self, tmpdir='.', verbose=0, target=None, debug=None):
        """The 'target' argument gives the final file name of the
        compiled DLL.  Use '*' to force distutils' choice, suitable for
        regular CPython C API modules.  Use a file name ending in '.*'
        to ask for the system's default extension for dynamic libraries
        (.so/.dll/.dylib).

        The default is '*' when building a non-embedded C API extension,
        and (module_name + '.*') when building an embedded library.
        """
        from .recompiler import recompile
        #
        if not hasattr(self, '_assigned_source'):
            raise ValueError("set_source() must be called before compile()")
        module_name, source, source_extension, kwds = self._assigned_source
        return recompile(self, module_name, source, tmpdir=tmpdir,
                         target=target, source_extension=source_extension,
                         compiler_verbose=verbose, debug=debug, **kwds)
    def init_once(self, func, tag):
        # Read _init_once_cache[tag], which is either (False, lock) if
        # we're calling the function now in some thread, or (True, result).
        # Don't call setdefault() in most cases, to avoid allocating and
        # immediately freeing a lock; but still use setdefault() to avoid
        # races.
        try:
            x = self._init_once_cache[tag]
        except KeyError:
            x = self._init_once_cache.setdefault(tag, (False, allocate_lock()))
        # Common case: we got (True, result), so we return the result.
        if x[0]:
            return x[1]
        # Else, it's a lock.  Acquire it to serialize the following tests.
        with x[1]:
            # Read again from _init_once_cache the current status.
            x = self._init_once_cache[tag]
            if x[0]:
                return x[1]
            # Call the function and store the result back.
            result = func()
            self._init_once_cache[tag] = (True, result)
        return result

    def embedding_init_code(self, pysource):
        if self._embedding:
            raise ValueError("embedding_init_code() can only be called once")
        # fix 'pysource' before it gets dumped into the C file:
        # - remove empty lines at the beginning, so it starts at "line 1"
        # - dedent, if all non-empty lines are indented
        # - check for SyntaxErrors
        import re
        match = re.match(r'\s*\n', pysource)
        if match:
            pysource = pysource[match.end():]
        lines = pysource.splitlines() or ['']
        prefix = re.match(r'\s*', lines[0]).group()
        for i in range(1, len(lines)):
            line = lines[i]
            if line.rstrip():
                while not line.startswith(prefix):
                    prefix = prefix[:-1]
        i = len(prefix)
        lines = [line[i:]+'\n' for line in lines]
        pysource = ''.join(lines)
        #
        compile(pysource, "cffi_init", "exec")
        #
        self._embedding = pysource
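The run-exactly-once behavior that `init_once()` implements above can be sketched like this (illustrative only; `expensive_setup` and the tag are made-up names):

```python
from cffi import FFI

ffi = FFI()
calls = []

def expensive_setup():
    calls.append(1)
    return "ready"

# The first call runs the function under a per-tag lock;
# the second call returns the cached (True, result) entry.
r1 = ffi.init_once(expensive_setup, "setup-tag")
r2 = ffi.init_once(expensive_setup, "setup-tag")
```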
    def def_extern(self, *args, **kwds):
        raise ValueError("ffi.def_extern() is only available on API-mode FFI "
                         "objects")

    def list_types(self):
        """Returns the user type names known to this FFI instance.
        This returns a tuple containing three lists of names:
        (typedef_names, names_of_structs, names_of_unions)
        """
        typedefs = []
        structs = []
        unions = []
        for key in self._parser._declarations:
            if key.startswith('typedef '):
                typedefs.append(key[8:])
            elif key.startswith('struct '):
                structs.append(key[7:])
            elif key.startswith('union '):
                unions.append(key[6:])
        typedefs.sort()
        structs.sort()
        unions.sort()
        return (typedefs, structs, unions)

def _load_backend_lib(backend, name, flags):
    import os
    if not isinstance(name, basestring):
        if sys.platform != "win32" or name is not None:
            return backend.load_library(name, flags)
        name = "c"    # Windows: load_library(None) fails, but this works
                      # on Python 2 (backward compatibility hack only)
    first_error = None
    if '.' in name or '/' in name or os.sep in name:
        try:
            return backend.load_library(name, flags)
        except OSError as e:
            first_error = e
    import ctypes.util
    path = ctypes.util.find_library(name)
    if path is None:
        if name == "c" and sys.platform == "win32" and sys.version_info >= (3,):
            raise OSError("dlopen(None) cannot work on Windows for Python 3 "
                          "(see http://bugs.python.org/issue23606)")
        msg = ("ctypes.util.find_library() did not manage "
               "to locate a library called %r" % (name,))
        if first_error is not None:
            msg = "%s.  Additionally, %s" % (first_error, msg)
        raise OSError(msg)
    return backend.load_library(path, flags)
def _make_ffi_library(ffi, libname, flags):
    backend = ffi._backend
    backendlib = _load_backend_lib(backend, libname, flags)
    #
    def accessor_function(name):
        key = 'function ' + name
        tp, _ = ffi._parser._declarations[key]
        BType = ffi._get_cached_btype(tp)
        value = backendlib.load_function(BType, name)
        library.__dict__[name] = value
    #
    def accessor_variable(name):
        key = 'variable ' + name
        tp, _ = ffi._parser._declarations[key]
        BType = ffi._get_cached_btype(tp)
        read_variable = backendlib.read_variable
        write_variable = backendlib.write_variable
        setattr(FFILibrary, name, property(
            lambda self: read_variable(BType, name),
            lambda self, value: write_variable(BType, name, value)))
    #
    def addressof_var(name):
        try:
            return addr_variables[name]
        except KeyError:
            with ffi._lock:
                if name not in addr_variables:
                    key = 'variable ' + name
                    tp, _ = ffi._parser._declarations[key]
                    BType = ffi._get_cached_btype(tp)
                    if BType.kind != 'array':
                        BType = model.pointer_cache(ffi, BType)
                    p = backendlib.load_function(BType, name)
                    addr_variables[name] = p
            return addr_variables[name]
    #
    def accessor_constant(name):
        raise NotImplementedError("non-integer constant '%s' cannot be "
                                  "accessed from a dlopen() library" % (name,))
    #
    def accessor_int_constant(name):
        library.__dict__[name] = ffi._parser._int_constants[name]
    #
    accessors = {}
    accessors_version = [False]
    addr_variables = {}
    #
    def update_accessors():
        if accessors_version[0] is ffi._cdef_version:
            return
        #
        for key, (tp, _) in ffi._parser._declarations.items():
            if not isinstance(tp, model.EnumType):
                tag, name = key.split(' ', 1)
                if tag == 'function':
                    accessors[name] = accessor_function
                elif tag == 'variable':
                    accessors[name] = accessor_variable
                elif tag == 'constant':
                    accessors[name] = accessor_constant
            else:
                for i, enumname in enumerate(tp.enumerators):
                    def accessor_enum(name, tp=tp, i=i):
                        tp.check_not_partial()
                        library.__dict__[name] = tp.enumvalues[i]
                    accessors[enumname] = accessor_enum
        for name in ffi._parser._int_constants:
            accessors.setdefault(name, accessor_int_constant)
        accessors_version[0] = ffi._cdef_version
    #
    def make_accessor(name):
        with ffi._lock:
            if name in library.__dict__ or name in FFILibrary.__dict__:
                return    # added by another thread while waiting for the lock
            if name not in accessors:
                update_accessors()
                if name not in accessors:
                    raise AttributeError(name)
            accessors[name](name)
    #
    class FFILibrary(object):
|
||||
def __getattr__(self, name):
|
||||
make_accessor(name)
|
||||
return getattr(self, name)
|
||||
def __setattr__(self, name, value):
|
||||
try:
|
||||
property = getattr(self.__class__, name)
|
||||
except AttributeError:
|
||||
make_accessor(name)
|
||||
setattr(self, name, value)
|
||||
else:
|
||||
property.__set__(self, value)
|
||||
def __dir__(self):
|
||||
with ffi._lock:
|
||||
update_accessors()
|
||||
return accessors.keys()
|
||||
def __addressof__(self, name):
|
||||
if name in library.__dict__:
|
||||
return library.__dict__[name]
|
||||
if name in FFILibrary.__dict__:
|
||||
return addressof_var(name)
|
||||
make_accessor(name)
|
||||
if name in library.__dict__:
|
||||
return library.__dict__[name]
|
||||
if name in FFILibrary.__dict__:
|
||||
return addressof_var(name)
|
||||
raise AttributeError("cffi library has no function or "
|
||||
"global variable named '%s'" % (name,))
|
||||
def __cffi_close__(self):
|
||||
backendlib.close_lib()
|
||||
self.__dict__.clear()
|
||||
#
|
||||
if isinstance(libname, basestring):
|
||||
try:
|
||||
if not isinstance(libname, str): # unicode, on Python 2
|
||||
libname = libname.encode('utf-8')
|
||||
FFILibrary.__name__ = 'FFILibrary_%s' % libname
|
||||
except UnicodeError:
|
||||
pass
|
||||
library = FFILibrary()
|
||||
return library, library.__dict__
|
||||
|
||||
def _builtin_function_type(func):
|
||||
# a hack to make at least ffi.typeof(builtin_function) work,
|
||||
# if the builtin function was obtained by 'vengine_cpy'.
|
||||
import sys
|
||||
try:
|
||||
module = sys.modules[func.__module__]
|
||||
ffi = module._cffi_original_ffi
|
||||
types_of_builtin_funcs = module._cffi_types_of_builtin_funcs
|
||||
tp = types_of_builtin_funcs[func]
|
||||
except (KeyError, AttributeError, TypeError):
|
||||
return None
|
||||
else:
|
||||
with ffi._lock:
|
||||
return ffi._get_cached_btype(tp)
|
||||
1121  venv/lib/python3.12/site-packages/cffi/backend_ctypes.py  (new file; diff suppressed because it is too large)
187  venv/lib/python3.12/site-packages/cffi/cffi_opcode.py  (new file)
@@ -0,0 +1,187 @@
from .error import VerificationError


class CffiOp(object):
    def __init__(self, op, arg):
        self.op = op
        self.arg = arg

    def as_c_expr(self):
        if self.op is None:
            assert isinstance(self.arg, str)
            return '(_cffi_opcode_t)(%s)' % (self.arg,)
        classname = CLASS_NAME[self.op]
        return '_CFFI_OP(_CFFI_OP_%s, %s)' % (classname, self.arg)

    def as_python_bytes(self):
        if self.op is None and self.arg.isdigit():
            value = int(self.arg)     # non-negative: '-' not in self.arg
            if value >= 2**31:
                raise OverflowError("cannot emit %r: limited to 2**31-1"
                                    % (self.arg,))
            return format_four_bytes(value)
        if isinstance(self.arg, str):
            raise VerificationError("cannot emit to Python: %r" % (self.arg,))
        return format_four_bytes((self.arg << 8) | self.op)

    def __str__(self):
        classname = CLASS_NAME.get(self.op, self.op)
        return '(%s %s)' % (classname, self.arg)


def format_four_bytes(num):
    return '\\x%02X\\x%02X\\x%02X\\x%02X' % (
        (num >> 24) & 0xFF,
        (num >> 16) & 0xFF,
        (num >>  8) & 0xFF,
        (num      ) & 0xFF)


OP_PRIMITIVE       = 1
OP_POINTER         = 3
OP_ARRAY           = 5
OP_OPEN_ARRAY      = 7
OP_STRUCT_UNION    = 9
OP_ENUM            = 11
OP_FUNCTION        = 13
OP_FUNCTION_END    = 15
OP_NOOP            = 17
OP_BITFIELD        = 19
OP_TYPENAME        = 21
OP_CPYTHON_BLTN_V  = 23   # varargs
OP_CPYTHON_BLTN_N  = 25   # noargs
OP_CPYTHON_BLTN_O  = 27   # O  (i.e. a single arg)
OP_CONSTANT        = 29
OP_CONSTANT_INT    = 31
OP_GLOBAL_VAR      = 33
OP_DLOPEN_FUNC     = 35
OP_DLOPEN_CONST    = 37
OP_GLOBAL_VAR_F    = 39
OP_EXTERN_PYTHON   = 41

PRIM_VOID          = 0
PRIM_BOOL          = 1
PRIM_CHAR          = 2
PRIM_SCHAR         = 3
PRIM_UCHAR         = 4
PRIM_SHORT         = 5
PRIM_USHORT        = 6
PRIM_INT           = 7
PRIM_UINT          = 8
PRIM_LONG          = 9
PRIM_ULONG         = 10
PRIM_LONGLONG      = 11
PRIM_ULONGLONG     = 12
PRIM_FLOAT         = 13
PRIM_DOUBLE        = 14
PRIM_LONGDOUBLE    = 15

PRIM_WCHAR         = 16
PRIM_INT8          = 17
PRIM_UINT8         = 18
PRIM_INT16         = 19
PRIM_UINT16        = 20
PRIM_INT32         = 21
PRIM_UINT32        = 22
PRIM_INT64         = 23
PRIM_UINT64        = 24
PRIM_INTPTR        = 25
PRIM_UINTPTR       = 26
PRIM_PTRDIFF       = 27
PRIM_SIZE          = 28
PRIM_SSIZE         = 29
PRIM_INT_LEAST8    = 30
PRIM_UINT_LEAST8   = 31
PRIM_INT_LEAST16   = 32
PRIM_UINT_LEAST16  = 33
PRIM_INT_LEAST32   = 34
PRIM_UINT_LEAST32  = 35
PRIM_INT_LEAST64   = 36
PRIM_UINT_LEAST64  = 37
PRIM_INT_FAST8     = 38
PRIM_UINT_FAST8    = 39
PRIM_INT_FAST16    = 40
PRIM_UINT_FAST16   = 41
PRIM_INT_FAST32    = 42
PRIM_UINT_FAST32   = 43
PRIM_INT_FAST64    = 44
PRIM_UINT_FAST64   = 45
PRIM_INTMAX        = 46
PRIM_UINTMAX       = 47
PRIM_FLOATCOMPLEX  = 48
PRIM_DOUBLECOMPLEX = 49
PRIM_CHAR16        = 50
PRIM_CHAR32        = 51

_NUM_PRIM          = 52
_UNKNOWN_PRIM          = -1
_UNKNOWN_FLOAT_PRIM    = -2
_UNKNOWN_LONG_DOUBLE   = -3

_IO_FILE_STRUCT        = -1

PRIMITIVE_TO_INDEX = {
    'char':               PRIM_CHAR,
    'short':              PRIM_SHORT,
    'int':                PRIM_INT,
    'long':               PRIM_LONG,
    'long long':          PRIM_LONGLONG,
    'signed char':        PRIM_SCHAR,
    'unsigned char':      PRIM_UCHAR,
    'unsigned short':     PRIM_USHORT,
    'unsigned int':       PRIM_UINT,
    'unsigned long':      PRIM_ULONG,
    'unsigned long long': PRIM_ULONGLONG,
    'float':              PRIM_FLOAT,
    'double':             PRIM_DOUBLE,
    'long double':        PRIM_LONGDOUBLE,
    '_cffi_float_complex_t':  PRIM_FLOATCOMPLEX,
    '_cffi_double_complex_t': PRIM_DOUBLECOMPLEX,
    '_Bool':              PRIM_BOOL,
    'wchar_t':            PRIM_WCHAR,
    'char16_t':           PRIM_CHAR16,
    'char32_t':           PRIM_CHAR32,
    'int8_t':             PRIM_INT8,
    'uint8_t':            PRIM_UINT8,
    'int16_t':            PRIM_INT16,
    'uint16_t':           PRIM_UINT16,
    'int32_t':            PRIM_INT32,
    'uint32_t':           PRIM_UINT32,
    'int64_t':            PRIM_INT64,
    'uint64_t':           PRIM_UINT64,
    'intptr_t':           PRIM_INTPTR,
    'uintptr_t':          PRIM_UINTPTR,
    'ptrdiff_t':          PRIM_PTRDIFF,
    'size_t':             PRIM_SIZE,
    'ssize_t':            PRIM_SSIZE,
    'int_least8_t':       PRIM_INT_LEAST8,
    'uint_least8_t':      PRIM_UINT_LEAST8,
    'int_least16_t':      PRIM_INT_LEAST16,
    'uint_least16_t':     PRIM_UINT_LEAST16,
    'int_least32_t':      PRIM_INT_LEAST32,
    'uint_least32_t':     PRIM_UINT_LEAST32,
    'int_least64_t':      PRIM_INT_LEAST64,
    'uint_least64_t':     PRIM_UINT_LEAST64,
    'int_fast8_t':        PRIM_INT_FAST8,
    'uint_fast8_t':       PRIM_UINT_FAST8,
    'int_fast16_t':       PRIM_INT_FAST16,
    'uint_fast16_t':      PRIM_UINT_FAST16,
    'int_fast32_t':       PRIM_INT_FAST32,
    'uint_fast32_t':      PRIM_UINT_FAST32,
    'int_fast64_t':       PRIM_INT_FAST64,
    'uint_fast64_t':      PRIM_UINT_FAST64,
    'intmax_t':           PRIM_INTMAX,
    'uintmax_t':          PRIM_UINTMAX,
    }

F_UNION         = 0x01
F_CHECK_FIELDS  = 0x02
F_PACKED        = 0x04
F_EXTERNAL      = 0x08
F_OPAQUE        = 0x10

G_FLAGS = dict([('_CFFI_' + _key, globals()[_key])
                for _key in ['F_UNION', 'F_CHECK_FIELDS', 'F_PACKED',
                             'F_EXTERNAL', 'F_OPAQUE']])

CLASS_NAME = {}
for _name, _value in list(globals().items()):
    if _name.startswith('OP_') and isinstance(_value, int):
        CLASS_NAME[_value] = _name[3:]
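The opcode encoding in `cffi_opcode.py` is compact: `CffiOp.as_python_bytes` packs the argument into the high 24 bits and the opcode into the low byte, then serialises the result as four big-endian byte escapes. A small sketch of that packing, using constants copied from the file above and a helper mirroring `format_four_bytes`:

```python
# Constants copied from cffi_opcode.py above.
OP_PRIMITIVE = 1
PRIM_INT = 7

def pack_opcode(op, arg):
    # as_python_bytes packs arg into the high 24 bits, op into the low byte
    return (arg << 8) | op

def format_four_bytes(num):
    # same textual big-endian encoding as cffi_opcode.format_four_bytes
    return '\\x%02X\\x%02X\\x%02X\\x%02X' % (
        (num >> 24) & 0xFF,
        (num >> 16) & 0xFF,
        (num >> 8) & 0xFF,
        num & 0xFF)

packed = pack_opcode(OP_PRIMITIVE, PRIM_INT)   # encodes "primitive type int"
```

Because the opcode occupies only the low byte and arguments are limited to `2**31 - 1`, every op fits in a single 32-bit word, which is what the `OverflowError` guard in `as_python_bytes` enforces.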
82  venv/lib/python3.12/site-packages/cffi/commontypes.py  (new file)
@@ -0,0 +1,82 @@
import sys
from . import model
from .error import FFIError


COMMON_TYPES = {}

try:
    # fetch "bool" and all simple Windows types
    from _cffi_backend import _get_common_types
    _get_common_types(COMMON_TYPES)
except ImportError:
    pass

COMMON_TYPES['FILE'] = model.unknown_type('FILE', '_IO_FILE')
COMMON_TYPES['bool'] = '_Bool'    # in case we got ImportError above
COMMON_TYPES['float _Complex'] = '_cffi_float_complex_t'
COMMON_TYPES['double _Complex'] = '_cffi_double_complex_t'

for _type in model.PrimitiveType.ALL_PRIMITIVE_TYPES:
    if _type.endswith('_t'):
        COMMON_TYPES[_type] = _type
del _type

_CACHE = {}

def resolve_common_type(parser, commontype):
    try:
        return _CACHE[commontype]
    except KeyError:
        cdecl = COMMON_TYPES.get(commontype, commontype)
        if not isinstance(cdecl, str):
            result, quals = cdecl, 0    # cdecl is already a BaseType
        elif cdecl in model.PrimitiveType.ALL_PRIMITIVE_TYPES:
            result, quals = model.PrimitiveType(cdecl), 0
        elif cdecl == 'set-unicode-needed':
            raise FFIError("The Windows type %r is only available after "
                           "you call ffi.set_unicode()" % (commontype,))
        else:
            if commontype == cdecl:
                raise FFIError(
                    "Unsupported type: %r. Please look at "
                    "http://cffi.readthedocs.io/en/latest/cdef.html#ffi-cdef-limitations "
                    "and file an issue if you think this type should really "
                    "be supported." % (commontype,))
            result, quals = parser.parse_type_and_quals(cdecl)   # recursive

        assert isinstance(result, model.BaseTypeByIdentity)
        _CACHE[commontype] = result, quals
        return result, quals


# ____________________________________________________________
# extra types for Windows (most of them are in commontypes.c)


def win_common_types():
    return {
        "UNICODE_STRING": model.StructType(
            "_UNICODE_STRING",
            ["Length",
             "MaximumLength",
             "Buffer"],
            [model.PrimitiveType("unsigned short"),
             model.PrimitiveType("unsigned short"),
             model.PointerType(model.PrimitiveType("wchar_t"))],
            [-1, -1, -1]),
        "PUNICODE_STRING": "UNICODE_STRING *",
        "PCUNICODE_STRING": "const UNICODE_STRING *",

        "TBYTE": "set-unicode-needed",
        "TCHAR": "set-unicode-needed",
        "LPCTSTR": "set-unicode-needed",
        "PCTSTR": "set-unicode-needed",
        "LPTSTR": "set-unicode-needed",
        "PTSTR": "set-unicode-needed",
        "PTBYTE": "set-unicode-needed",
        "PTCHAR": "set-unicode-needed",
        }

if sys.platform == 'win32':
    COMMON_TYPES.update(win_common_types())
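`resolve_common_type` in `commontypes.py` combines two ideas: an alias table (`COMMON_TYPES`) mapping common spellings to canonical C declarations, and a module-level memo (`_CACHE`) so each name is parsed at most once. A toy re-creation of that pattern; the table contents and the `parse` callback are stand-ins for illustration, not cffi's actual API:

```python
# Toy alias table and cache in the spirit of commontypes.py.
COMMON_TYPES = {'bool': '_Bool'}
_CACHE = {}

def resolve(name, parse):
    try:
        return _CACHE[name]          # hit: skip table lookup and parsing
    except KeyError:
        cdecl = COMMON_TYPES.get(name, name)   # alias table, else as-is
        result = parse(cdecl)
        _CACHE[name] = result        # memoise so parse() runs only once
        return result

calls = []
def fake_parse(cdecl):
    # stand-in for parser.parse_type_and_quals; records what it was given
    calls.append(cdecl)
    return ('parsed', cdecl)

r1 = resolve('bool', fake_parse)
r2 = resolve('bool', fake_parse)   # second call served from the cache
```

The try/except-KeyError shape makes the common cache-hit path a single dict lookup, which matters since type resolution runs for every identifier in a `cdef`.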
Some files were not shown because too many files have changed in this diff.