Add podgrab featureset
This commit is contained in: parent 095bf52a2f, commit 233dd5b5c0
33 changed files with 2315 additions and 125 deletions
@@ -3,7 +3,7 @@ FLASK_ENV=development
 SECRET_KEY=your_secret_key_here

 # Database configuration
-DATABASE_URI=sqlite:///podcastrr.db
+DATABASE_URI=sqlite:///instance/podcastrr.db

 # Application configuration
 DOWNLOAD_PATH=C:\path\to\downloads

@@ -12,4 +12,4 @@ LOG_LEVEL=INFO
 # API Keys (if needed)
 # ITUNES_API_KEY=your_itunes_api_key
 # SPOTIFY_CLIENT_ID=your_spotify_client_id
-# SPOTIFY_CLIENT_SECRET=your_spotify_client_secret
+# SPOTIFY_CLIENT_SECRET=your_spotify_client_secret
1 .gitignore vendored
@@ -2,3 +2,4 @@
/instance/podcastrr.db
/podcastrr.db
/.venv/
/downloads/
101 MIGRATION_GUIDE.md Normal file
@@ -0,0 +1,101 @@
# Podcastrr Migration Guide

## Resolving Common Database Errors

### "no such column: podcasts.tags" Error

If you encounter the following error when accessing podcast pages:

```
sqlite3.OperationalError: no such column: podcasts.tags
```

This means that your database schema is out of date and missing the `tags` column in the `podcasts` table. This can happen if you've updated the codebase but haven't run the necessary database migrations.

### "no such table: episodes" Error

If you encounter the following error when starting the application:

```
Error running migration: no such table: episodes
```

This means that the migration script is trying to modify the `episodes` table before it has been created. This issue has been fixed in the latest version of the application, which ensures that tables are created before migrations are run.

## How to Fix the Issue

### Option 1: Run the Migration Script

The simplest way to fix this issue is to run the provided migration script:

```
python run_migrations.py
```

This script will run all necessary migrations, including adding the `tags` column to the `podcasts` table.

### Option 2: Reinitialize the Database

If you're still having issues, you can reinitialize the database:

```
python init_db.py
```

This will create all tables and run all migrations. Note that this will preserve your existing data.

## Understanding Migrations in Podcastrr

Podcastrr uses a simple migration system to update the database schema when new features are added. Migrations are stored in the `migrations` directory and are automatically run when:

1. The application starts (via `application.py`)
2. The database is initialized (via `init_db.py`)
3. The migration script is run (via `run_migrations.py`)
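The discovery step behind those three entry points can be sketched in a few lines. This is a minimal sketch assuming only the convention described in this guide (every `.py` file in `migrations/` exposes a `run_migration()` function); the project's actual `run_migrations.py` may differ in details:

```python
import importlib.util
import os

def discover_and_run(migrations_dir):
    """Import each .py file in migrations_dir and call its run_migration()."""
    results = {}
    for name in sorted(os.listdir(migrations_dir)):
        if not name.endswith('.py') or name.startswith('_'):
            continue
        path = os.path.join(migrations_dir, name)
        spec = importlib.util.spec_from_file_location(name[:-3], path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        if hasattr(module, 'run_migration'):
            # Record each migration's return value (True on success)
            results[name] = module.run_migration()
    return results
```

Sorting the filenames gives a stable, predictable execution order across runs.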

## Adding New Migrations

If you need to add a new migration:

1. Create a new Python file in the `migrations` directory (e.g., `add_new_feature.py`)
2. Implement a `run_migration()` function that makes the necessary database changes
3. The migration will be automatically discovered and run by the application

Example migration structure:

```python
"""
Migration script to add a new feature.
"""
import sqlite3
from flask import current_app


def run_migration():
    """
    Run the migration to add the new feature.
    """
    # Get the database path
    db_path = current_app.config['SQLALCHEMY_DATABASE_URI'].replace('sqlite:///', '')

    # Connect to the database
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    # Make database changes
    # ...

    # Commit and close
    conn.commit()
    conn.close()

    print("Migration completed successfully!")
    return True
```
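For a concrete instance of this template, here is a hedged sketch of what an `add_podcast_tags`-style migration could look like. It takes the database path as a parameter so the snippet is self-contained; in the application the path would come from `current_app.config['SQLALCHEMY_DATABASE_URI']` as shown in the template above, and the real migration file may differ:

```python
import sqlite3

def run_migration(db_path):
    """Add the `tags` column to the podcasts table if it is missing."""
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    # Only add the column if it is not already present, so re-running is safe
    cursor.execute("PRAGMA table_info(podcasts)")
    existing = {row[1] for row in cursor.fetchall()}
    if 'tags' not in existing:
        # String(512) in the model; SQLite gives VARCHAR(512) TEXT affinity
        cursor.execute("ALTER TABLE podcasts ADD COLUMN tags VARCHAR(512)")
    conn.commit()
    conn.close()
    return True
```

The `PRAGMA table_info` guard is what makes the migration idempotent, which matters because the same scripts run at every application start.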

## Troubleshooting

If you're still experiencing issues after running the migrations:

1. Check the application logs for detailed error messages
2. Verify that the database file exists and is accessible
3. Ensure that the migration scripts are in the correct location
4. Try restarting the application after running the migrations
22 README.md
@@ -7,7 +7,12 @@ A podcast management application similar to Sonarr but for podcasts, built with
 - **Search for Podcasts**: Find podcasts from various sources
 - **Track Podcasts**: Monitor your favorite podcasts for new episodes
 - **Download Management**: Automatically download new episodes and manage storage
+- **Complete Podcast Archive**: Download all episodes of a podcast with one click
 - **Custom Naming**: Configure how downloaded files are named
+- **Tag/Label System**: Organize podcasts into groups with tags
+- **Direct RSS Feed**: Add podcasts using direct RSS feed URLs
+- **OPML Import/Export**: Easily import and export podcast subscriptions
+- **Existing Episode Detection**: Prevent re-downloading files if already present
 - **Web Interface**: Manage everything through an intuitive web interface

 ## Requirements
@@ -47,11 +52,18 @@ A podcast management application similar to Sonarr but for podcasts, built with

 5. Initialize the database:
 ```
-flask db init
-flask db migrate
-flask db upgrade
+python init_db.py
 ```
+
+This will create the database and run all necessary migrations.
+
+6. If you're updating an existing installation and encounter database errors:
+```
+python run_migrations.py
+```
+
+This will apply any pending migrations to your database. See the [Migration Guide](MIGRATION_GUIDE.md) for more details.

 ## Usage

 Run the application:

@@ -77,7 +89,3 @@ Then open your browser and navigate to `http://localhost:5000`.
 ```
 black .
 ```
-
-## License
-
-MIT
93 README_DATABASE_FIX.md Normal file
@@ -0,0 +1,93 @@
# Fix for "unable to open database file" Error

## Problem

The application was encountering the following error when starting up:

```
Error during database initialization: (sqlite3.OperationalError) unable to open database file
```

This error occurs when SQLite can't access the database file, which could be due to:

1. The directory for the database file doesn't exist
2. The application doesn't have permission to create or access the database file
3. The path to the database file is incorrect

## Root Cause

The issue was caused by the application trying to create a database file in the `instance` directory, but not ensuring that the directory exists first. The database path was correctly configured as `sqlite:///instance/podcastrr.db`, but the application didn't create the `instance` directory before trying to create the database file.
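The missing step amounts to a couple of lines. `ensure_instance_dir` is a hypothetical helper name used here for illustration, not code from the project:

```python
import os

def ensure_instance_dir(instance_path):
    """Create the instance directory if needed and report whether it is writable."""
    # exist_ok=True makes this safe to call on every startup
    os.makedirs(instance_path, exist_ok=True)
    return os.access(instance_path, os.W_OK)
```

In a Flask app this would typically be called with `app.instance_path` before the database engine first touches the file.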

## Solution

The solution involved several enhancements to the database initialization process:

1. **Test if the instance directory is writable**: Added code to test if the instance directory is writable by creating and removing a test file:

```python
# Create an empty file in the instance directory to ensure it's writable
try:
    test_file_path = os.path.join(instance_path, '.test_write')
    with open(test_file_path, 'w') as f:
        f.write('test')
    os.remove(test_file_path)
except Exception as e:
    app.logger.error(f"Error writing to instance directory: {str(e)}")
    app.logger.error("This may indicate a permissions issue with the instance directory.")
```

2. **Test if the database directory is writable**: Added code to test if the database directory is writable:

```python
# Test if we can write to the database directory
try:
    test_file_path = os.path.join(db_dir if db_dir else '.', '.test_db_write')
    with open(test_file_path, 'w') as f:
        f.write('test')
    os.remove(test_file_path)
    app.logger.info("Database directory is writable")
except Exception as e:
    app.logger.error(f"Error writing to database directory: {str(e)}")
    app.logger.error("This may indicate a permissions issue with the database directory.")
```

3. **Attempt to create the database file directly**: If the directory tests fail, try to create the database file directly:

```python
# Try to create the database file directly to see if that works
try:
    with open(db_path, 'a'):
        pass
    app.logger.info(f"Created empty database file: {db_path}")
except Exception as e:
    app.logger.error(f"Error creating database file: {str(e)}")
    app.logger.error("This may indicate a permissions issue with the database file.")
```

4. **Improved logging**: Added more detailed logging throughout the process to help diagnose issues:

```python
app.logger.info(f"Database path: {db_path}")
app.logger.info(f"Created database directory: {db_dir}")
app.logger.info("Creating database tables...")
app.logger.info("Database tables created successfully")
```

## How to Verify the Solution

1. Run the application:
   ```
   python main.py
   ```

2. Verify that the application starts without any database-related errors.

3. Check the logs for any error messages related to database initialization.

4. Check that the database file has been created in the `instance` directory.

## Preventing Similar Issues in the Future

To prevent similar issues in the future:

1. Always ensure that directories exist before trying to create files in them.
2. Test if directories and files are writable before attempting operations on them.
3. Use Flask's built-in `app.instance_path` to get the correct instance directory path.
4. Add proper error handling and logging to help diagnose issues.
5. Consider using a more robust database setup process that handles these edge cases automatically.
6. Implement a database connection retry mechanism for transient issues.
162 README_DATABASE_FIX_V2.md Normal file
@@ -0,0 +1,162 @@
# Fix for "unable to open database file" Error - Version 2

## Problem

The application was encountering the following error when starting up:

```
Error during database initialization: (sqlite3.OperationalError) unable to open database file
```

Despite previous fixes to ensure the instance directory exists and is writable, the application was still unable to create or access the database file.

## Root Cause

The issue could be caused by several factors:

1. The database file path might be incorrect or inaccessible
2. There might be permission issues with the database file
3. The database directory might not be writable
4. There might be a locking issue with the database file
5. SQLAlchemy might be having issues connecting to the database

## Solution

The solution involved several enhancements to the database initialization process:

### 1. Using Absolute Paths for Database File

Modified the database connection string to use an absolute path to the database file:

```python
SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', f'sqlite:///{os.path.abspath(os.path.join(os.path.dirname(__file__), "instance", "podcastrr.db"))}')
```

This ensures that SQLite can find the database file regardless of the current working directory.

### 2. Enhanced Database File Checks

Added more comprehensive checks for the database file:

- Check if the database file exists
- Check if the database file is readable and writable
- Attempt to fix permissions if there are issues
- Create the database file if it doesn't exist
- Set appropriate permissions on the newly created file
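Those checks can be sketched as a standalone helper. `check_db_file` is a hypothetical name chosen for this sketch; the real code in `application.py` logs its findings instead of returning a report:

```python
import os
import stat

def check_db_file(db_path):
    """Ensure the database file exists and is readable and writable."""
    report = {'created': False}
    if not os.path.exists(db_path):
        # Create an empty file so SQLite can open it later
        with open(db_path, 'a'):
            pass
        # Owner read/write, group/others read
        os.chmod(db_path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH)
        report['created'] = True
    report['readable'] = os.access(db_path, os.R_OK)
    report['writable'] = os.access(db_path, os.W_OK)
    if not report['writable']:
        # Attempt to fix permissions before giving up
        os.chmod(db_path, stat.S_IRUSR | stat.S_IWUSR)
        report['writable'] = os.access(db_path, os.W_OK)
    return report
```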

### 3. Retry Mechanism for Database Connection

Added a retry mechanism for database connection:

```python
# Try to create the database tables with retries
max_retries = 3
retry_count = 0
while retry_count < max_retries:
    try:
        # Test the database connection first
        with db.engine.connect() as conn:
            app.logger.info("Database connection successful")

        # Create the tables
        db.create_all()
        break
    except Exception as e:
        retry_count += 1
        app.logger.error(f"Error creating database tables (attempt {retry_count}/{max_retries}): {str(e)}")
        if retry_count >= max_retries:
            app.logger.error("Maximum retry attempts reached. Could not create database tables.")
            raise
        import time
        time.sleep(1)  # Wait a second before retrying
```

This helps with transient connection issues by attempting to connect multiple times before giving up.
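The same loop in a generic form, pulled out as a helper for clarity (purely illustrative; `application.py` inlines the logic rather than using a helper like this):

```python
import time

def with_retries(fn, max_retries=3, delay=0.0):
    """Call fn(), retrying up to max_retries times before re-raising."""
    for attempt in range(1, max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt >= max_retries:
                raise  # give up after the final attempt
            time.sleep(delay)
```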

### 4. Fallback to In-Memory Database

Added a fallback mechanism that uses an in-memory SQLite database if all attempts to use a file-based database fail:

```python
# If we've tried multiple times and still failing, try a fallback approach
if retry_count >= max_retries and not fallback_used:
    app.logger.warning("Maximum retry attempts reached. Trying fallback approach...")

    try:
        # Create a fallback database in memory
        app.logger.info("Attempting to use in-memory SQLite database as fallback")
        app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///:memory:'

        # Reinitialize the database with the new connection string
        db.init_app(app)

        # Reset retry counter and set fallback flag
        retry_count = 0
        fallback_used = True

        # Add a warning message that will be displayed in the application
        app.config['DB_FALLBACK_WARNING'] = True
        app.logger.warning("WARNING: Using in-memory database as fallback. Data will not be persisted between application restarts!")

        continue
    except Exception as fallback_error:
        app.logger.error(f"Error setting up fallback database: {str(fallback_error)}")
```

This provides a last resort option if all other attempts to create the database fail. The in-memory database won't persist data between application restarts, but it will at least allow the application to start and function temporarily.

### 5. User-Visible Warning for Fallback Database

Added a warning message that is displayed in the application if the fallback database is being used:

```python
# Add a context processor to make the fallback warning available to all templates
@app.context_processor
def inject_db_fallback_warning():
    """Inject the database fallback warning into all templates."""
    return {
        'db_fallback_warning': app.config.get('DB_FALLBACK_WARNING', False)
    }
```

And in the base.html template:

```html
<!-- Database Fallback Warning -->
{% if db_fallback_warning %}
<div class="flash-messages">
    <div class="flash-message error" style="background-color: #f85149; color: white; font-weight: bold;">
        WARNING: Using in-memory database as fallback. Data will not be persisted between application restarts!
        <br>
        <span style="font-size: 0.9em;">Please check the application logs for details on how to fix this issue.</span>
    </div>
</div>
{% endif %}
```

This helps users understand the implications of using the fallback database and encourages them to fix the underlying issue.

## How to Verify the Solution

1. Run the application:
   ```
   python main.py
   ```

2. Verify that the application starts without any database-related errors.

3. If the application is using the fallback database, you should see a warning message at the top of the page.

4. Check the logs for any error messages related to database initialization.

## Preventing Similar Issues in the Future

To prevent similar issues in the future:

1. Always ensure that directories exist before trying to create files in them.
2. Use absolute paths for database files to avoid issues with relative paths.
3. Test if directories and files are writable before attempting operations on them.
4. Add proper error handling and logging to help diagnose issues.
5. Implement retry mechanisms for database connections to handle transient issues.
6. Provide fallback options for critical components to ensure the application can still function.
7. Add user-visible warnings for fallback modes to encourage fixing the underlying issues.
50 README_FIX.md Normal file
@@ -0,0 +1,50 @@
# Fix for UnboundLocalError in application.py

## Problem

The application was encountering the following error when starting up:

```
Traceback (most recent call last):
  File "C:\Users\cody\PycharmProjects\Podcastrr\main.py", line 47, in <module>
    main()
  File "C:\Users\cody\PycharmProjects\Podcastrr\main.py", line 26, in main
    app = create_app()
          ^^^^^^^^^^^^
  File "C:\Users\cody\PycharmProjects\Podcastrr\application.py", line 25, in create_app
    SECRET_KEY=os.environ.get('SECRET_KEY', 'dev'),
               ^^
UnboundLocalError: cannot access local variable 'os' where it is not associated with a value
```

## Root Cause

The error was caused by a scope issue with the `os` module in `application.py`. The module was imported at the top of the file (global scope), but it was also imported again inside the `app_context()` block (local scope).

When Python sees a variable being assigned in a function (which includes imports), it treats that variable as local to the function. This means that when the code tried to access `os.environ.get()` before the local import was executed, Python raised an `UnboundLocalError` because it saw that `os` would be defined as a local variable later in the function, but it wasn't yet defined at the point of use.
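The scoping rule is easy to reproduce in isolation. The function below is a minimal stand-in for `create_app()`, not the project's actual code:

```python
import os  # global import, like the one at the top of application.py

def create_app_broken():
    # This line runs first, but Python has already decided `os` is local
    # to this function because of the import statement below.
    secret = os.environ.get('SECRET_KEY', 'dev')
    import os  # the assignment that makes `os` a local name
    return secret

try:
    create_app_broken()
except UnboundLocalError as exc:
    print(f"Reproduced: {exc}")
```

The local-name decision is made at compile time for the whole function body, which is why the error fires on a line *before* the offending import.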

## Solution

The solution was to remove the redundant import of `os` inside the `app_context()` block. The `os` module was already imported at the top of the file, so there was no need to import it again.

### Changes Made

In `application.py`, removed the following line:

```python
import os
```

from inside the `app_context()` block (around line 72).

## Verification

After making this change, the application should start up without encountering the `UnboundLocalError`. The `os` module from the global scope will be used throughout the function, which resolves the error.

## Preventing Similar Issues in the Future

To prevent similar issues in the future:

1. Avoid importing the same module multiple times in different scopes
2. Be careful with variable names that might shadow global imports
3. When possible, import all modules at the top of the file
67 README_SOLUTION_DB_PATH.md Normal file
@@ -0,0 +1,67 @@
# Solution to Database Path Issue

## Problem

The application was encountering the following errors when starting up:

```
Error running migration add_episode_ordering.py: no such table: podcasts
Error running migration add_podcast_tags.py: no such table: podcasts
Error running migration add_season_explicit_naming_format.py: no such table: episodes
```

And when accessing podcast pages:

```
sqlite3.OperationalError: no such column: podcasts.tags
```

## Root Cause

The issue was caused by a mismatch between where the application was looking for the database file and where the database file was actually located:

1. The application was configured to look for the database file at `sqlite:///podcastrr.db`, which is a relative path to a file in the root directory.
2. However, the actual database file was located in the `instance` directory (`instance/podcastrr.db`).
3. This caused the migrations to fail because they couldn't find the tables they were trying to modify.

## Solution

The solution was to update the database path in the application configuration to point to the correct location:

1. Modified `application.py` to change the default database path from `sqlite:///podcastrr.db` to `sqlite:///instance/podcastrr.db`.
2. This ensures that the application and all migrations look for the database file in the `instance` directory, which is where Flask stores instance-specific files by default.

## Changes Made

In `application.py`, the following change was made:

```python
# Before
SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', 'sqlite:///podcastrr.db')

# After
SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', 'sqlite:///instance/podcastrr.db')
```

## How to Verify the Solution

1. Run the application:
   ```
   python main.py
   ```

2. Verify that the application starts without any database-related errors.

3. Access a podcast page to verify that the "no such column: podcasts.tags" error is resolved.

## Preventing Similar Issues in the Future

To prevent similar issues in the future:

1. Always use consistent database paths across the application.
2. Consider using Flask's built-in `app.instance_path` to get the correct instance directory path.
3. Update the `.env.example` file to reflect the correct database path:
   ```
   DATABASE_URI=sqlite:///instance/podcastrr.db
   ```
4. Document the expected location of the database file in the README.md file.
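Point 2 above can be sketched as follows; `database_uri` is a hypothetical helper shown only to illustrate building the URI from the instance directory:

```python
import os

def database_uri(instance_path, filename='podcastrr.db'):
    """Build an absolute sqlite:/// URI rooted in the given instance directory."""
    return 'sqlite:///' + os.path.abspath(os.path.join(instance_path, filename))
```

In `create_app()` this would be called as `database_uri(app.instance_path)`, which sidesteps both the relative-path and the wrong-directory failure modes described above.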
124 SOLUTION.md Normal file
@@ -0,0 +1,124 @@
# Solution to Database Migration Errors

## Problems

### "no such column: podcasts.tags" Error

The application was encountering a SQLite error when accessing podcast pages:

```
sqlite3.OperationalError: no such column: podcasts.tags
```

This error occurred because the database schema was out of date and missing the `tags` column in the `podcasts` table. The column was added to the model in the code, but the migration to add it to the database hadn't been applied.

### "no such table: episodes" Error

The application was also encountering an error when starting up:

```
Error running migration: no such table: episodes
```

This error occurred because the migration script was trying to modify the `episodes` table before it had been created. The migrations were being run during application startup, but before the database tables were created.

## Root Causes

### "no such column: podcasts.tags" Error

This issue was caused by a combination of factors:

1. The `tags` column was added to the `Podcast` model in `app/models/podcast.py`
2. A migration script (`migrations/add_podcast_tags.py`) was created to add the column to the database
3. The migration script was included in `application.py` to run during application startup
4. However, the migration wasn't being applied to the database, possibly due to:
   - The application not being restarted after the migration was added
   - An import error in `init_db.py` preventing proper database initialization

### "no such table: episodes" Error

This issue was caused by the order of operations in the application startup process:

1. The migration scripts were being run in `application.py` during the `create_app()` function
2. The database tables were being created in `main.py` after `create_app()` was called
3. This meant that migrations were trying to modify tables before they were created
4. Specifically, the `add_season_explicit_naming_format.py` migration was trying to add columns to the `episodes` table before it existed

## Solutions

### "no such column: podcasts.tags" Error

The solution for this issue involved several components:

1. **Fixed Import Error**: Corrected the import statement in `init_db.py` to properly import `create_app` from `application.py` instead of from `app`.

2. **Created Migration Runner**: Developed a dedicated script (`run_migrations.py`) to run all migrations, ensuring the `tags` column is added to the database.

3. **Added Testing Tool**: Created a test script (`test_migration.py`) to verify if the `tags` column exists and offer to run the migration if needed.

4. **Documented the Process**: Created a comprehensive migration guide (`MIGRATION_GUIDE.md`) explaining how to resolve the issue and handle future migrations.

5. **Updated README**: Added information about the migration process to the README.md file, ensuring users are aware of how to handle database updates.

### "no such table: episodes" Error

The solution for this issue involved changing the order of operations during application startup:

1. **Modified Database Initialization**: Updated `application.py` to create all database tables before running any migrations, ensuring that tables exist before migrations try to modify them.

2. **Removed Redundant Code**: Removed the redundant `db.create_all()` call from `main.py` since tables are now created in `application.py`.

3. **Improved Migration Handling**: Modified `application.py` to use a more robust approach for running migrations, similar to what's used in `init_db.py`. Now it dynamically discovers and runs all migration scripts in the `migrations` directory.

4. **Updated Documentation**: Updated the migration guide to include information about this error and how it was fixed.

## How to Use the Solution

### For the "no such column: podcasts.tags" Error

If you encounter this error when accessing podcast pages:

1. Run the migration script:
   ```
   python run_migrations.py
   ```

2. Alternatively, you can test if the migration is needed:
   ```
   python test_migration.py
   ```

3. If you're still having issues, reinitialize the database:
   ```
   python init_db.py
   ```

4. Restart the application:
   ```
   python main.py
   ```

### For the "no such table: episodes" Error

If you encounter this error when starting the application:

1. Update to the latest version of the application, which includes the fix for this issue.

2. If you're still experiencing the error, run the initialization script:
   ```
   python init_db.py
   ```

3. Restart the application:
   ```
   python main.py
   ```

## Preventing Similar Issues in the Future

To prevent similar issues in the future:

1. Always run `python run_migrations.py` after pulling updates that might include database changes
2. Follow the guidelines in the Migration Guide when adding new database fields
3. Use the test script to verify database schema changes
4. Consider implementing a more robust migration system (like Alembic) for larger projects
@@ -22,6 +22,7 @@ class Podcast(db.Model):
     auto_download = db.Column(db.Boolean, default=False)
     naming_format = db.Column(db.String(255), nullable=True)  # If null, use global settings
     episode_ordering = db.Column(db.String(20), default='absolute')  # 'absolute' or 'season_episode'
+    tags = db.Column(db.String(512), nullable=True)  # Comma-separated list of tags

     # Relationships
     episodes = db.relationship('Episode', backref='podcast', lazy='dynamic', cascade='all, delete-orphan')
@@ -45,9 +46,49 @@ class Podcast(db.Model):
             'last_checked': self.last_checked.isoformat() if self.last_checked else None,
             'auto_download': self.auto_download,
             'naming_format': self.naming_format,
+            'tags': self.tags.split(',') if self.tags else [],
             'episode_count': self.episodes.count()
         }

+    def get_tags(self):
+        """
+        Get the list of tags for this podcast.
+
+        Returns:
+            list: List of tags.
+        """
+        return [tag.strip() for tag in self.tags.split(',')] if self.tags else []
+
+    def add_tag(self, tag):
+        """
+        Add a tag to this podcast.
+
+        Args:
+            tag (str): Tag to add.
+        """
+        if not tag:
+            return
+
+        tags = self.get_tags()
+        if tag not in tags:
+            tags.append(tag)
+            self.tags = ','.join(tags)
+
+    def remove_tag(self, tag):
+        """
+        Remove a tag from this podcast.
+
+        Args:
+            tag (str): Tag to remove.
+        """
+        if not tag:
+            return
+
+        tags = self.get_tags()
+        if tag in tags:
+            tags.remove(tag)
+            self.tags = ','.join(tags) if tags else None
+
 class Episode(db.Model):
     """
     Model representing a podcast episode.
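Because the tag helpers only manipulate a comma-separated string, they can be exercised without SQLAlchemy. The plain class below mirrors the methods in the diff above, for illustration only:

```python
class TagMixin:
    tags = None  # comma-separated string, as in the Podcast model

    def get_tags(self):
        return [t.strip() for t in self.tags.split(',')] if self.tags else []

    def add_tag(self, tag):
        if not tag:
            return
        tags = self.get_tags()
        if tag not in tags:
            tags.append(tag)
            self.tags = ','.join(tags)

    def remove_tag(self, tag):
        if not tag:
            return
        tags = self.get_tags()
        if tag in tags:
            tags.remove(tag)
            # Removing the last tag resets the column to NULL
            self.tags = ','.join(tags) if tags else None
```

For example, adding `news` then `tech` stores the string `news,tech`, and removing both returns the field to `None`.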
155 app/services/opml_handler.py Normal file
@@ -0,0 +1,155 @@
"""
OPML import/export functionality for Podcastrr.
"""
import xml.etree.ElementTree as ET
from xml.dom import minidom
import logging
from datetime import datetime
from flask import current_app

# Set up logging
logger = logging.getLogger(__name__)


def parse_opml(opml_content):
    """
    Parse OPML content and extract podcast feed URLs.

    Args:
        opml_content (str): OPML file content.

    Returns:
        list: List of dictionaries containing podcast information.
    """
    try:
        root = ET.fromstring(opml_content)

        # Find all outline elements that represent podcasts
        podcasts = []

        # Look for outlines in the body
        body = root.find('body')
        if body is None:
            logger.error("OPML file has no body element")
            return []

        # Process all outline elements
        for outline in body.findall('.//outline'):
            # Check if this is a podcast outline (has xmlUrl attribute)
            xml_url = outline.get('xmlUrl')
            if xml_url:
                podcast = {
                    'feed_url': xml_url,
                    'title': outline.get('title') or outline.get('text', 'Unknown Podcast'),
                    'description': outline.get('description', ''),
                    'html_url': outline.get('htmlUrl', '')
                }
                podcasts.append(podcast)

        logger.info(f"Parsed OPML file and found {len(podcasts)} podcasts")
        return podcasts
    except Exception as e:
        logger.error(f"Error parsing OPML file: {str(e)}")
        return []


def generate_opml(podcasts):
    """
    Generate OPML content from a list of podcasts.

    Args:
        podcasts (list): List of Podcast model instances.

    Returns:
        str: OPML file content.
    """
    try:
        # Create the root element
        root = ET.Element('opml')
        root.set('version', '2.0')

        # Create the head element
        head = ET.SubElement(root, 'head')
        title = ET.SubElement(head, 'title')
        title.text = 'Podcastrr Subscriptions'
        date_created = ET.SubElement(head, 'dateCreated')
        date_created.text = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')

        # Create the body element
        body = ET.SubElement(root, 'body')

        # Add each podcast as an outline element
        for podcast in podcasts:
            outline = ET.SubElement(body, 'outline')
            outline.set('type', 'rss')
            outline.set('text', podcast.title)
            outline.set('title', podcast.title)
            outline.set('xmlUrl', podcast.feed_url)
            if podcast.description:
                outline.set('description', podcast.description)

        # Convert to pretty-printed XML
        xml_str = ET.tostring(root, encoding='utf-8')
        parsed_xml = minidom.parseString(xml_str)
        pretty_xml = parsed_xml.toprettyxml(indent="  ")
||||
logger.info(f"Generated OPML file with {len(podcasts)} podcasts")
|
||||
return pretty_xml
|
||||
except Exception as e:
|
||||
logger.error(f"Error generating OPML file: {str(e)}")
|
||||
return ""
|
||||
|
||||
def import_podcasts_from_opml(opml_content):
|
||||
"""
|
||||
Import podcasts from OPML content into the database.
|
||||
|
||||
Args:
|
||||
opml_content (str): OPML file content.
|
||||
|
||||
Returns:
|
||||
dict: Statistics about the import process.
|
||||
"""
|
||||
from app.models.podcast import Podcast
|
||||
from app.models.database import db
|
||||
from app.services.podcast_updater import update_podcast
|
||||
|
||||
podcasts = parse_opml(opml_content)
|
||||
|
||||
stats = {
|
||||
'total': len(podcasts),
|
||||
'imported': 0,
|
||||
'skipped': 0,
|
||||
'errors': 0
|
||||
}
|
||||
|
||||
for podcast_data in podcasts:
|
||||
try:
|
||||
# Check if podcast already exists
|
||||
existing = Podcast.query.filter_by(feed_url=podcast_data['feed_url']).first()
|
||||
|
||||
if existing:
|
||||
logger.info(f"Podcast already exists: {podcast_data['title']}")
|
||||
stats['skipped'] += 1
|
||||
continue
|
||||
|
||||
# Create new podcast
|
||||
podcast = Podcast(
|
||||
title=podcast_data['title'],
|
||||
description=podcast_data.get('description', ''),
|
||||
feed_url=podcast_data['feed_url']
|
||||
)
|
||||
|
||||
db.session.add(podcast)
|
||||
db.session.commit()
|
||||
|
||||
# Update podcast to fetch episodes
|
||||
try:
|
||||
update_podcast(podcast.id)
|
||||
except Exception as e:
|
||||
logger.error(f"Error updating podcast {podcast.title}: {str(e)}")
|
||||
|
||||
stats['imported'] += 1
|
||||
logger.info(f"Imported podcast: {podcast.title}")
|
||||
except Exception as e:
|
||||
stats['errors'] += 1
|
||||
logger.error(f"Error importing podcast: {str(e)}")
|
||||
|
||||
return stats
|
|
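The core of `parse_opml()` above is plain `ElementTree` traversal; a self-contained sketch of the same extraction on a tiny OPML document (the feed URL is a made-up placeholder):

```python
import xml.etree.ElementTree as ET

opml = """<opml version="2.0">
  <head><title>Subscriptions</title></head>
  <body>
    <outline text="Example Show" title="Example Show"
             type="rss" xmlUrl="https://example.com/feed.xml"/>
    <outline text="A folder, not a feed"/>
  </body>
</opml>"""

root = ET.fromstring(opml)
# Only outlines carrying an xmlUrl attribute are podcast feeds;
# folder outlines without one are skipped, just as in parse_opml().
feeds = [o.get('xmlUrl') for o in root.find('body').findall('.//outline')
         if o.get('xmlUrl')]
# feeds == ['https://example.com/feed.xml']
```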
@@ -173,6 +173,8 @@ def format_filename(format_string, podcast, episode):
            # If episode_number exists but is not a digit, format as S01E{episode_number}
            else f"S{episode.season or 1:02d}E{episode.episode_number}"
            if episode.episode_number
+           # If neither season nor episode_number are available, use published date
+           else episode.published_date.strftime('%Y-%m-%d') if episode.published_date
            # Otherwise, return empty string
            else ''
        ),
@@ -195,10 +197,23 @@ def format_filename(format_string, podcast, episode):

    # Handle empty path segments by removing them
    path_parts = formatted_path.split(os.path.sep)
-   path_parts = [part for part in path_parts if part.strip()]
+
+   # Remove empty segments and segments that would be just placeholders without values
+   cleaned_parts = []
+   for part in path_parts:
+       part = part.strip()
+       if not part:
+           continue
+       # Check for common placeholders without values
+       if part in ["Season ", "Season", "Episode ", "Episode", "E", "S"]:
+           continue
+       # Check for patterns like "S01E" without an episode number
+       if part.startswith("S") and part.endswith("E") and len(part) > 2:
+           continue
+       cleaned_parts.append(part)

    # Rejoin the path with proper separators
-   return os.path.sep.join(path_parts)
+   return os.path.sep.join(cleaned_parts)

def sanitize_filename(filename):
    """
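The segment-cleaning loop introduced above can be stated as a standalone function; a condensed sketch (the placeholder checks run after `strip()`, so the trailing-space variants in the original list collapse into the stripped forms):

```python
import os

def clean_path(formatted_path):
    """Drop empty segments and value-less placeholders like 'Season' or 'S01E'."""
    cleaned = []
    for part in formatted_path.split(os.path.sep):
        part = part.strip()
        if not part:
            continue
        # A bare placeholder whose template variable expanded to nothing
        if part in ("Season", "Episode", "E", "S"):
            continue
        # A pattern like "S01E" with no trailing episode number
        if part.startswith("S") and part.endswith("E") and len(part) > 2:
            continue
        cleaned.append(part)
    return os.path.sep.join(cleaned)

# e.g. "My Show/S01E/Episode" collapses to "My Show" on POSIX paths
```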
@@ -277,6 +292,7 @@ def delete_old_episodes(days=30):
def verify_downloaded_episodes(podcast_id=None, progress_callback=None):
    """
    Verify that downloaded episodes still exist on disk and update their status.
+   Also checks for existing files for episodes that aren't marked as downloaded.

    Args:
        podcast_id (int, optional): ID of the podcast to check. If None, check all podcasts.
@@ -286,23 +302,24 @@ def verify_downloaded_episodes(podcast_id=None, progress_callback=None):
        dict: Statistics about the verification process.
    """
    from app.models.podcast import Episode, Podcast
+   from app.models.settings import Settings

-   # Get episodes to check
+   # First, verify episodes that are marked as downloaded
    query = Episode.query.filter(Episode.downloaded == True)
    if podcast_id:
        query = query.filter(Episode.podcast_id == podcast_id)

-   episodes = query.all()
-   total = len(episodes)
+   downloaded_episodes = query.all()
+   total_downloaded = len(downloaded_episodes)

    if progress_callback:
-       progress_callback(0, f"Verifying {total} downloaded episodes")
+       progress_callback(0, f"Verifying {total_downloaded} downloaded episodes")

    missing = 0
-   for i, episode in enumerate(episodes):
-       if progress_callback and total > 0:
-           progress = int((i / total) * 100)
-           progress_callback(progress, f"Verifying episode {i+1}/{total}")
+   for i, episode in enumerate(downloaded_episodes):
+       if progress_callback and total_downloaded > 0:
+           progress = 50 * 0 + int((i / total_downloaded) * 50)  # Use first half of progress for verification
+           progress_callback(progress, f"Verifying episode {i+1}/{total_downloaded}")

        if not episode.file_path or not os.path.exists(episode.file_path):
            episode.downloaded = False
@@ -312,15 +329,133 @@ def verify_downloaded_episodes(podcast_id=None, progress_callback=None):

    db.session.commit()

-   if progress_callback:
-       progress_callback(100, f"Verification complete. {missing} episodes marked as not downloaded.")
-
-   logger.info(f"Verified {total} episodes. {missing} were missing.")
+   # Now check for existing files for episodes that aren't marked as downloaded
+   query = Episode.query.filter(Episode.downloaded == False)
+   if podcast_id:
+       query = query.filter(Episode.podcast_id == podcast_id)
+
+   undownloaded_episodes = query.all()
+   total_undownloaded = len(undownloaded_episodes)
+
+   if progress_callback:
+       progress_callback(50, f"Checking for existing files for {total_undownloaded} undownloaded episodes")
+
+   found = 0
+   if total_undownloaded > 0 and podcast_id:
+       # Get the podcast
+       podcast = Podcast.query.get(podcast_id)
+       if not podcast:
+           logger.error(f"Podcast with ID {podcast_id} not found")
+           return {
+               'total_checked': total_downloaded,
+               'missing': missing,
+               'found': 0
+           }
+
+       # Get settings
+       settings = Settings.query.first()
+       if not settings:
+           settings = Settings(
+               download_path=current_app.config['DOWNLOAD_PATH'],
+               naming_format="{podcast_title}/{episode_title}"
+           )
+           db.session.add(settings)
+           db.session.commit()
+
+       # Use podcast's naming format if available, otherwise use global settings
+       naming_format = podcast.naming_format or settings.naming_format
+       download_path = settings.download_path
+
+       # Check each undownloaded episode for existing files
+       for i, episode in enumerate(undownloaded_episodes):
+           if progress_callback:
+               progress = 50 + int((i / total_undownloaded) * 50)  # Use second half of progress for file matching
+               progress_callback(progress, f"Checking for file for episode {i+1}/{total_undownloaded}")
+
+           try:
+               # Format filename using the naming format
+               filename = format_filename(naming_format, podcast, episode)
+
+               # Check for common audio file extensions
+               extensions = ['.mp3', '.m4a', '.ogg', '.wav']
+               for ext in extensions:
+                   file_path = os.path.normpath(os.path.join(download_path, filename + ext))
+                   if os.path.exists(file_path):
+                       logger.info(f"Found existing file for episode: {file_path}")
+                       episode.downloaded = True
+                       episode.file_path = file_path
+                       found += 1
+                       break
+           except Exception as e:
+               logger.error(f"Error checking for existing file for episode {episode.title}: {str(e)}")
+
+       db.session.commit()
+
+   if progress_callback:
+       progress_callback(100, f"Verification complete. {missing} episodes marked as not downloaded, {found} files matched.")
+
+   logger.info(f"Verified {total_downloaded} episodes. {missing} were missing. Found files for {found} undownloaded episodes.")
    return {
-       'total_checked': total,
-       'missing': missing
+       'total_checked': total_downloaded,
+       'missing': missing,
+       'found': found
    }

+def download_all_episodes(podcast_id, progress_callback=None):
+   """
+   Download all episodes of a podcast that haven't been downloaded yet.
+
+   Args:
+       podcast_id: ID of the Podcast to download all episodes for.
+       progress_callback (callable, optional): Callback function for progress updates.
+
+   Returns:
+       dict: Statistics about the download process.
+   """
+   from app.models.podcast import Podcast, Episode
+
+   if progress_callback:
+       progress_callback(2, "Loading podcast data")
+
+   # Load the podcast
+   podcast = Podcast.query.get(podcast_id)
+   if not podcast:
+       raise ValueError(f"Podcast with ID {podcast_id} not found")
+
+   # Get all episodes that haven't been downloaded yet
+   episodes = Episode.query.filter_by(podcast_id=podcast_id, downloaded=False).all()
+   total_episodes = len(episodes)
+
+   if progress_callback:
+       progress_callback(5, f"Found {total_episodes} episodes to download")
+
+   if total_episodes == 0:
+       if progress_callback:
+           progress_callback(100, "No episodes to download")
+       return {"total": 0, "downloaded": 0, "failed": 0}
+
+   stats = {"total": total_episodes, "downloaded": 0, "failed": 0}
+
+   # Download each episode
+   for i, episode in enumerate(episodes):
+       if progress_callback:
+           progress = 5 + int((i / total_episodes) * 90)  # Scale from 5% to 95%
+           progress_callback(progress, f"Downloading episode {i+1}/{total_episodes}: {episode.title}")
+
+       try:
+           download_episode(episode.id)
+           stats["downloaded"] += 1
+           logger.info(f"Downloaded episode {i+1}/{total_episodes}: {episode.title}")
+       except Exception as e:
+           stats["failed"] += 1
+           logger.error(f"Error downloading episode {episode.title}: {str(e)}")
+
+   if progress_callback:
+       progress_callback(100, f"Download complete. Downloaded {stats['downloaded']} episodes, {stats['failed']} failed.")
+
+   logger.info(f"Podcast archive download completed: {stats}")
+   return stats

def rename_episode(episode_id, new_format=None, progress_callback=None):
    """
    Rename a downloaded episode file using a new format.
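The extension probe used in the file-matching pass above is easy to isolate; a sketch with a hypothetical `find_existing_audio` helper that tries the same candidate extensions in order:

```python
import os
import tempfile

def find_existing_audio(base_path, extensions=('.mp3', '.m4a', '.ogg', '.wav')):
    """Given a base path without an extension, return the first existing audio file."""
    for ext in extensions:
        candidate = os.path.normpath(base_path + ext)
        if os.path.exists(candidate):
            return candidate
    return None

# Demonstrate against a throwaway directory with one .m4a file on disk.
with tempfile.TemporaryDirectory() as d:
    base = os.path.join(d, 'show', 'episode-1')
    os.makedirs(os.path.dirname(base))
    open(base + '.m4a', 'w').close()
    found = find_existing_audio(base)   # ends with '.m4a'
```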
@@ -142,15 +142,126 @@ def get_podcast_episodes(feed_url):
            'published_date': _parse_date(entry.get('published')),
            'guid': entry.get('id', ''),
            'duration': _parse_duration(entry.get('itunes_duration', '')),
-           'season': entry.get('itunes_season'),  # Season number
-           'episode_number': entry.get('itunes_episode', ''),  # Episode number within season
+           'season': None,  # Default to None
+           'episode_number': None,  # Default to None, will try to extract from various sources
            'explicit': False  # Default to False
        }

-       # Handle explicit flag safely
-       itunes_explicit = entry.get('itunes_explicit', '')
-       if isinstance(itunes_explicit, str) and itunes_explicit:
-           episode['explicit'] = itunes_explicit.lower() == 'yes'
+       # Handle season tag - try multiple ways to access it
+       try:
+           # Try as attribute first
+           if hasattr(entry, 'itunes_season'):
+               episode['season'] = int(entry.itunes_season) if entry.itunes_season else None
+               logger.debug(f"Found season as attribute: {episode['season']}")
+           # Try as dictionary key
+           elif entry.get('itunes_season'):
+               episode['season'] = int(entry.get('itunes_season')) if entry.get('itunes_season') else None
+               logger.debug(f"Found season as dict key: {episode['season']}")
+           # Try looking in tags
+           elif hasattr(entry, 'tags'):
+               for tag in entry.tags:
+                   if tag.get('term', '').startswith('Season'):
+                       try:
+                           episode['season'] = int(tag.get('term').replace('Season', '').strip())
+                           logger.debug(f"Found season in tags: {episode['season']}")
+                           break
+                       except (ValueError, TypeError):
+                           pass
+       except Exception as e:
+           logger.warning(f"Error parsing season: {str(e)}")
+
+       # Handle episode number - try multiple ways to access it
+       try:
+           # Try as attribute first (itunes_episode)
+           if hasattr(entry, 'itunes_episode') and entry.itunes_episode:
+               episode['episode_number'] = entry.itunes_episode
+               logger.debug(f"Found episode number as attribute: {episode['episode_number']}")
+           # Try as dictionary key
+           elif entry.get('itunes_episode'):
+               episode['episode_number'] = entry.get('itunes_episode')
+               logger.debug(f"Found episode number as dict key: {episode['episode_number']}")
+           # Try to extract from title if it contains "Episode X" or "Ep X" or "#X"
+           elif episode['title']:
+               import re
+               # Common patterns for episode numbers in titles
+               patterns = [
+                   r'Episode\s+(\d+)',  # "Episode 123"
+                   r'Ep\s*(\d+)',       # "Ep123" or "Ep 123"
+                   r'#(\d+)',           # "#123"
+                   r'E(\d+)',           # "E123" or "S1E123"
+               ]
+
+               for pattern in patterns:
+                   match = re.search(pattern, episode['title'], re.IGNORECASE)
+                   if match:
+                       episode['episode_number'] = match.group(1)
+                       logger.debug(f"Extracted episode number from title: {episode['episode_number']}")
+                       break
+       except Exception as e:
+           logger.warning(f"Error parsing episode number: {str(e)}")
+
+       # Handle explicit flag - try multiple ways to access it
+       try:
+           # Try as attribute first
+           if hasattr(entry, 'itunes_explicit'):
+               explicit_value = entry.itunes_explicit
+               if isinstance(explicit_value, str):
+                   episode['explicit'] = explicit_value.lower() in ('yes', 'true')
+                   logger.debug(f"Found explicit as attribute: {episode['explicit']}")
+           # Try as dictionary key
+           elif entry.get('itunes_explicit'):
+               explicit_value = entry.get('itunes_explicit')
+               if isinstance(explicit_value, str):
+                   episode['explicit'] = explicit_value.lower() in ('yes', 'true')
+                   logger.debug(f"Found explicit as dict key: {episode['explicit']}")
+       except Exception as e:
+           logger.warning(f"Error parsing explicit flag: {str(e)}")
+
+       # Handle the different combinations of season and episode numbers
+       # Case 1: No season, no episode - use published date to create a sequential order
+       if episode['season'] is None and (episode['episode_number'] is None or episode['episode_number'] == ''):
+           if episode['published_date']:
+               # Use the publication date to create a pseudo-episode number
+               # Format: YYYYMMDD (e.g., 20230101 for January 1, 2023)
+               episode['episode_number'] = episode['published_date'].strftime('%Y%m%d')
+               logger.debug(f"No season or episode number, using date as episode number: {episode['episode_number']}")
+           else:
+               # If no publication date, use a placeholder
+               episode['episode_number'] = "unknown"
+               logger.debug("No season, episode number, or date available")
+
+       # Case 2: No season, but episode number exists - keep episode number as is
+       elif episode['season'] is None and episode['episode_number'] is not None:
+           logger.debug(f"Using episode number without season: {episode['episode_number']}")
+
+       # Case 3: Season exists, no episode number - use season as prefix for ordering
+       elif episode['season'] is not None and (episode['episode_number'] is None or episode['episode_number'] == ''):
+           if episode['published_date']:
+               # Use the publication date with season prefix
+               # Format: S01_YYYYMMDD
+               episode['episode_number'] = f"S{episode['season']:02d}_{episode['published_date'].strftime('%Y%m%d')}"
+               logger.debug(f"Season without episode number, using season+date: {episode['episode_number']}")
+           else:
+               # If no publication date, use season with unknown suffix
+               episode['episode_number'] = f"S{episode['season']:02d}_unknown"
+               logger.debug(f"Season without episode number or date: {episode['episode_number']}")
+
+       # Case 4: Both season and episode exist - format as S01E02
+       elif episode['season'] is not None and episode['episode_number'] is not None:
+           # Check if episode_number is already formatted as S01E02
+           import re
+           if not re.match(r'^S\d+E\d+$', str(episode['episode_number']), re.IGNORECASE):
+               try:
+                   # Try to convert episode_number to integer for proper formatting
+                   ep_num = int(episode['episode_number'])
+                   episode['episode_number'] = f"S{episode['season']:02d}E{ep_num:02d}"
+                   logger.debug(f"Formatted season and episode as: {episode['episode_number']}")
+               except (ValueError, TypeError):
+                   # If episode_number can't be converted to int, use as is with season prefix
+                   episode['episode_number'] = f"S{episode['season']:02d}_{episode['episode_number']}"
+                   logger.debug(f"Using season prefix with non-numeric episode: {episode['episode_number']}")
+           else:
+               logger.debug(f"Episode already formatted correctly: {episode['episode_number']}")

        # Generate a GUID if one is not provided
        if not episode['guid']:
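The title-parsing fallback added above tries a fixed list of regular expressions in priority order; condensed into a standalone helper (illustrative, same patterns as the diff):

```python
import re

PATTERNS = [
    r'Episode\s+(\d+)',  # "Episode 123"
    r'Ep\s*(\d+)',       # "Ep123" or "Ep 123"
    r'#(\d+)',           # "#123"
    r'E(\d+)',           # "E123" or "S1E123"
]

def episode_number_from_title(title):
    """Return the first episode number captured from a title, else None."""
    for pattern in PATTERNS:
        match = re.search(pattern, title, re.IGNORECASE)
        if match:
            return match.group(1)
    return None

episode_number_from_title("My Show - Episode 42")  # '42'
episode_number_from_title("S02E07: Finale")        # '07' (via the E(\d+) fallback)
```

Pattern order matters: the broad `E(\d+)` fallback only fires when the more specific forms fail, which keeps "Episode 42" from being captured as "E 4…" fragments.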
@@ -128,20 +128,60 @@ def update_podcast(podcast_id, progress_callback=None):
                published_date=episode_data.get('published_date'),
                duration=episode_data.get('duration'),
                file_size=episode_data.get('file_size'),
                season=episode_data.get('season'),  # Season number
                episode_number=episode_data.get('episode_number'),
                guid=episode_data['guid'],
-               downloaded=False
+               downloaded=False,
+               explicit=episode_data.get('explicit')  # Explicit flag
            )

            db.session.add(episode)
            stats['new_episodes'] += 1
            logger.info(f"Added new episode: {episode.title}")

-           # Auto-download if enabled
-           if podcast.auto_download and episode.audio_url:
-               try:
-                   # Need to commit first to ensure episode has an ID
+           # Need to commit first to ensure episode has an ID
            db.session.commit()

+           # Check if file already exists for this episode
+           try:
+               from app.services.podcast_downloader import format_filename
+               import os
+               from app.models.settings import Settings
+
+               settings = Settings.query.first()
+               if not settings:
+                   settings = Settings(
+                       download_path=current_app.config['DOWNLOAD_PATH'],
+                       naming_format="{podcast_title}/{episode_title}"
+                   )
+                   db.session.add(settings)
+                   db.session.commit()
+
+               # Use podcast's naming format if available, otherwise use global settings
+               naming_format = podcast.naming_format or settings.naming_format
+
+               # Format filename using the naming format
+               filename = format_filename(naming_format, podcast, episode)
+               download_path = settings.download_path
+
+               # Check for common audio file extensions
+               extensions = ['.mp3', '.m4a', '.ogg', '.wav']
+               for ext in extensions:
+                   file_path = os.path.normpath(os.path.join(download_path, filename + ext))
+                   if os.path.exists(file_path):
+                       logger.info(f"Found existing file for episode: {file_path}")
+                       episode.downloaded = True
+                       episode.file_path = file_path
+                       db.session.commit()
+                       break
+
+               logger.info(f"Checked for existing files for episode: {episode.title}")
+           except Exception as e:
+               logger.error(f"Error checking for existing files for episode {episode.title}: {str(e)}")
+
+           # Auto-download if enabled and not already downloaded
+           if podcast.auto_download and episode.audio_url and not episode.downloaded:
+               try:
                    download_episode(episode.id)
                    stats['episodes_downloaded'] += 1
                    logger.info(f"Auto-downloaded episode: {episode.title}")
@@ -172,12 +172,12 @@ class TaskManager:
        with self.lock:
            return list(self.tasks.values())

-   def clean_old_tasks(self, max_age_seconds=60):
+   def clean_old_tasks(self, max_age_seconds=86400):
        """
        Remove old completed or failed tasks.

        Args:
-           max_age_seconds (int): Maximum age of tasks to keep in seconds
+           max_age_seconds (int): Maximum age of tasks to keep in seconds (default: 24 hours)

        Returns:
            int: Number of tasks removed
@@ -33,6 +33,37 @@ def dashboard():
    # Get statistics
    total_podcasts = Podcast.query.count()

+   # Get episode statistics
+   from app.models.podcast import Episode
+   total_episodes = Episode.query.count()
+   downloaded_episodes = Episode.query.filter_by(downloaded=True).count()
+   not_downloaded_episodes = total_episodes - downloaded_episodes
+
+   # Calculate total storage used (in bytes)
+   from sqlalchemy import func
+   total_storage_bytes = Episode.query.filter_by(downloaded=True).with_entities(
+       func.sum(Episode.file_size)).scalar() or 0
+
+   # Format storage size in appropriate units
+   def format_size(size_bytes):
+       # Convert bytes to appropriate unit
+       if size_bytes < 1024:
+           return f"{size_bytes} B"
+       elif size_bytes < 1024 * 1024:
+           return f"{size_bytes / 1024:.2f} KB"
+       elif size_bytes < 1024 * 1024 * 1024:
+           return f"{size_bytes / (1024 * 1024):.2f} MB"
+       elif size_bytes < 1024 * 1024 * 1024 * 1024:
+           return f"{size_bytes / (1024 * 1024 * 1024):.2f} GB"
+       else:
+           return f"{size_bytes / (1024 * 1024 * 1024 * 1024):.2f} TB"
+
+   formatted_storage = format_size(total_storage_bytes)
+
    return render_template('dashboard.html',
                           title='Dashboard',
-                          total_podcasts=total_podcasts)
+                          total_podcasts=total_podcasts,
+                          total_episodes=total_episodes,
+                          downloaded_episodes=downloaded_episodes,
+                          not_downloaded_episodes=not_downloaded_episodes,
+                          formatted_storage=formatted_storage)
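The unit boundaries of the dashboard's nested `format_size` helper can be checked standalone; this is a copy of the same cascade lifted out of the route for illustration:

```python
def format_size(size_bytes):
    """Human-readable size: B, KB, MB, GB, TB with two decimals above bytes."""
    if size_bytes < 1024:
        return f"{size_bytes} B"
    elif size_bytes < 1024 * 1024:
        return f"{size_bytes / 1024:.2f} KB"
    elif size_bytes < 1024 * 1024 * 1024:
        return f"{size_bytes / (1024 * 1024):.2f} MB"
    elif size_bytes < 1024 ** 4:
        return f"{size_bytes / (1024 ** 3):.2f} GB"
    return f"{size_bytes / (1024 ** 4):.2f} TB"

format_size(1023)         # '1023 B'  (last value before the KB threshold)
format_size(1536)         # '1.50 KB'
format_size(5 * 1024**3)  # '5.00 GB'
```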
@@ -3,11 +3,13 @@ Podcast routes for the Podcastrr application.
"""
import logging
logger = logging.getLogger(__name__)
-from flask import Blueprint, render_template, request, redirect, url_for, flash, current_app
+from flask import Blueprint, render_template, request, redirect, url_for, flash, current_app, Response, send_file
from app.models.podcast import Podcast, Episode
from app.models.database import db
-from app.services.podcast_search import search_podcasts
+from app.services.podcast_search import search_podcasts, get_podcast_episodes
from app.services.podcast_downloader import download_episode
+from app.services.opml_handler import generate_opml, import_podcasts_from_opml
+import io

podcasts_bp = Blueprint('podcasts', __name__)
@@ -178,6 +180,27 @@ def update(podcast_id):
    flash(f'Update started in the background. Check the status in the tasks panel.', 'info')
    return redirect(url_for('podcasts.view', podcast_id=podcast_id))

+@podcasts_bp.route('/download_all/<int:podcast_id>', methods=['POST'])
+def download_all(podcast_id):
+   """
+   Download all episodes of a podcast in the background.
+   """
+   from app.services.task_manager import task_manager
+   from app.services.podcast_downloader import download_all_episodes
+
+   podcast = Podcast.query.get_or_404(podcast_id)
+
+   # Create a background task for downloading all episodes
+   task_id = task_manager.create_task(
+       'download_all',
+       f"Downloading all episodes for podcast: {podcast.title}",
+       download_all_episodes,
+       podcast_id
+   )
+
+   flash(f'Download of all episodes started in the background. Check the status in the tasks panel.', 'info')
+   return redirect(url_for('podcasts.view', podcast_id=podcast_id))

@podcasts_bp.route('/verify/<int:podcast_id>', methods=['POST'])
def verify(podcast_id):
    """
@ -252,3 +275,165 @@ def update_naming_format(podcast_id):
|
|||
flash(f'Naming format reset to global settings for {podcast.title}.', 'success')
|
||||
|
||||
return redirect(url_for('podcasts.view', podcast_id=podcast_id))
|
||||
|
||||
@podcasts_bp.route('/update_tags/<int:podcast_id>', methods=['POST'])
|
||||
def update_tags(podcast_id):
|
||||
"""
|
||||
Update the tags for a podcast.
|
||||
"""
|
||||
podcast = Podcast.query.get_or_404(podcast_id)
|
||||
|
||||
# Get the tags from the form
|
||||
tags = request.form.get('tags', '')
|
||||
|
||||
# Split the tags by comma and strip whitespace
|
||||
tag_list = [tag.strip() for tag in tags.split(',') if tag.strip()]
|
||||
|
||||
# Update the podcast's tags
|
||||
podcast.tags = ','.join(tag_list) if tag_list else None
|
||||
db.session.commit()
|
||||
|
||||
flash(f'Tags updated for {podcast.title}.', 'success')
|
||||
return redirect(url_for('podcasts.view', podcast_id=podcast_id))
|
||||
|
||||
@podcasts_bp.route('/tag/<string:tag>')
|
||||
def filter_by_tag(tag):
|
||||
"""
|
||||
Filter podcasts by tag.
|
||||
"""
|
||||
# Find all podcasts with the given tag
|
||||
# We need to use LIKE with wildcards because tags are stored as a comma-separated string
|
||||
podcasts = Podcast.query.filter(
|
||||
(Podcast.tags == tag) | # Exact match
|
||||
(Podcast.tags.like(f'{tag},%')) | # Tag at the beginning
|
||||
(Podcast.tags.like(f'%,{tag},%')) | # Tag in the middle
|
||||
(Podcast.tags.like(f'%,{tag}')) # Tag at the end
|
||||
).all()
|
||||
|
||||
return render_template('podcasts/index.html',
|
||||
title=f'Podcasts tagged with "{tag}"',
|
||||
podcasts=podcasts,
|
||||
current_tag=tag)
|
||||
|
||||
@podcasts_bp.route('/import_opml', methods=['GET', 'POST'])
|
||||
def import_opml():
|
||||
"""
|
||||
Import podcasts from an OPML file.
|
||||
"""
|
||||
if request.method == 'POST':
|
||||
# Check if a file was uploaded
|
||||
if 'opml_file' not in request.files:
|
||||
flash('No file selected.', 'error')
|
||||
return redirect(url_for('podcasts.index'))
|
||||
|
||||
opml_file = request.files['opml_file']
|
||||
|
||||
# Check if the file has a name
|
||||
if opml_file.filename == '':
|
||||
flash('No file selected.', 'error')
|
||||
return redirect(url_for('podcasts.index'))
|
||||
|
||||
# Check if the file is an OPML file
|
||||
if not opml_file.filename.lower().endswith('.opml') and not opml_file.filename.lower().endswith('.xml'):
|
||||
flash('Invalid file format. Please upload an OPML file.', 'error')
|
||||
return redirect(url_for('podcasts.index'))
|
||||
|
||||
# Read the file content
|
||||
opml_content = opml_file.read().decode('utf-8')
|
||||
|
||||
# Import podcasts from the OPML file
|
||||
from app.services.task_manager import task_manager
|
||||
|
||||
# Create a background task for importing
|
||||
task_id = task_manager.create_task(
|
||||
'import_opml',
|
||||
f"Importing podcasts from OPML file: {opml_file.filename}",
|
||||
import_podcasts_from_opml,
|
||||
opml_content
|
||||
)
|
||||
|
||||
flash(f'OPML import started in the background. Check the status in the tasks panel.', 'info')
|
||||
return redirect(url_for('podcasts.index'))
|
||||
|
||||
return render_template('podcasts/import_opml.html',
|
||||
title='Import OPML')
|
||||
|
||||
@podcasts_bp.route('/export_opml')
|
||||
def export_opml():
|
||||
"""
|
||||
Export podcasts to an OPML file.
|
||||
"""
|
||||
# Get all podcasts
|
||||
podcasts = Podcast.query.all()
|
||||
|
||||
# Generate OPML content
|
||||
    opml_content = generate_opml(podcasts)

    # Create a file-like object from the OPML content
    opml_file = io.BytesIO(opml_content.encode('utf-8'))

    # Return the file as a download
    return send_file(
        opml_file,
        mimetype='application/xml',
        as_attachment=True,
        download_name='podcastrr_subscriptions.opml'
    )

@podcasts_bp.route('/add_by_url', methods=['POST'])
def add_by_url():
    """
    Add a podcast by its RSS feed URL.
    """
    feed_url = request.form.get('feed_url', '').strip()

    if not feed_url:
        flash('Please enter a valid RSS feed URL.', 'error')
        return redirect(url_for('podcasts.search'))

    # Check if podcast already exists
    existing = Podcast.query.filter_by(feed_url=feed_url).first()

    if existing:
        flash('Podcast is already being tracked.', 'info')
        return redirect(url_for('podcasts.view', podcast_id=existing.id))

    try:
        # Try to get podcast episodes to validate the feed
        episodes = get_podcast_episodes(feed_url)

        if not episodes:
            flash('No episodes found in the feed. Please check the URL and try again.', 'error')
            return redirect(url_for('podcasts.search'))

        # Get the first episode to extract podcast info
        first_episode = episodes[0]

        # Create podcast record with basic info
        podcast = Podcast(
            title=first_episode.get('podcast_title', 'Unknown Podcast'),
            feed_url=feed_url
        )

        db.session.add(podcast)
        db.session.commit()

        # Fetch episodes immediately after adding
        from app.services.podcast_updater import update_podcast

        # Create a background task for updating
        from app.services.task_manager import task_manager

        task_id = task_manager.create_task(
            'update',
            f"Fetching episodes for newly added podcast: {podcast.title}",
            update_podcast,
            podcast.id
        )

        flash('Podcast added successfully! Fetching episodes in the background.', 'success')
        return redirect(url_for('podcasts.view', podcast_id=podcast.id))
    except Exception as e:
        logger.error(f"Error adding podcast by URL: {str(e)}")
        flash(f'Error adding podcast: {str(e)}', 'error')
        return redirect(url_for('podcasts.search'))
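`add_by_url` relies on `get_podcast_episodes` both to validate the feed and to surface `podcast_title` for the new record. A stdlib-only sketch of that kind of parsing (the real service uses a dedicated feed library per `requirements.txt`; the sample feed, function name, and field names here are illustrative assumptions):

```python
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Show</title>
    <item><title>Episode 1</title><enclosure url="http://example.com/1.mp3" type="audio/mpeg"/></item>
    <item><title>Episode 2</title><enclosure url="http://example.com/2.mp3" type="audio/mpeg"/></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (podcast_title, episodes) from an RSS 2.0 document string."""
    channel = ET.fromstring(xml_text).find("channel")
    title = channel.findtext("title", default="Unknown Podcast")
    episodes = [
        {
            "podcast_title": title,          # mirrors first_episode.get('podcast_title', ...)
            "title": item.findtext("title"),
            "audio_url": item.find("enclosure").get("url"),
        }
        for item in channel.findall("item")
    ]
    return title, episodes
```

An empty `episodes` list here corresponds to the route's "No episodes found in the feed" branch.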
@@ -2,14 +2,36 @@
Task-related routes for the Podcastrr application.
"""
import logging
from flask import Blueprint, jsonify, request, current_app
from app.services.task_manager import task_manager
from flask import Blueprint, jsonify, request, current_app, render_template
from app.services.task_manager import task_manager, TaskStatus

# Set up logging
logger = logging.getLogger(__name__)

tasks_bp = Blueprint('tasks', __name__)

@tasks_bp.route('/tasks', methods=['GET'])
def view_tasks():
    """
    Render the tasks page showing task history and in-progress tasks.
    """
    tasks = task_manager.get_all_tasks()

    # Separate tasks by status
    running_tasks = [task for task in tasks if task.status == TaskStatus.RUNNING or task.status == TaskStatus.PENDING]
    completed_tasks = [task for task in tasks if task.status == TaskStatus.COMPLETED]
    failed_tasks = [task for task in tasks if task.status == TaskStatus.FAILED]

    # Sort tasks by created_at (newest first)
    running_tasks.sort(key=lambda x: x.created_at, reverse=True)
    completed_tasks.sort(key=lambda x: x.completed_at or x.created_at, reverse=True)
    failed_tasks.sort(key=lambda x: x.completed_at or x.created_at, reverse=True)

    return render_template('tasks/index.html',
                           running_tasks=running_tasks,
                           completed_tasks=completed_tasks,
                           failed_tasks=failed_tasks)

@tasks_bp.route('/api/tasks', methods=['GET'])
def get_tasks():
    """

@@ -17,10 +39,10 @@ def get_tasks():
    """
    status = request.args.get('status')
    tasks = task_manager.get_all_tasks()

    if status:
        tasks = [task for task in tasks if task.status.value == status]

    return jsonify({
        'tasks': [task.to_dict() for task in tasks]
    })

@@ -31,10 +53,10 @@ def get_task(task_id):
    Get a specific task by ID.
    """
    task = task_manager.get_task(task_id)

    if not task:
        return jsonify({'error': 'Task not found'}), 404

    return jsonify(task.to_dict())

@tasks_bp.route('/api/tasks/clean', methods=['POST'])

@@ -44,8 +66,8 @@ def clean_tasks():
    """
    max_age = request.json.get('max_age_seconds', 3600) if request.json else 3600
    count = task_manager.clean_old_tasks(max_age)

    return jsonify({
        'message': f'Cleaned up {count} old tasks',
        'count': count
    })
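The partition-and-sort logic in `view_tasks` can be exercised in isolation. This sketch assumes a simplified `Task`/`TaskStatus` shape, since the real `task_manager` classes are not shown in this diff:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class TaskStatus(Enum):
    PENDING = "pending"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"

@dataclass
class Task:
    name: str
    status: TaskStatus
    created_at: float
    completed_at: Optional[float] = None

def partition_tasks(tasks):
    """Split tasks into (running, completed, failed) lists, newest first,
    matching the grouping done in view_tasks."""
    running = [t for t in tasks if t.status in (TaskStatus.RUNNING, TaskStatus.PENDING)]
    completed = [t for t in tasks if t.status is TaskStatus.COMPLETED]
    failed = [t for t in tasks if t.status is TaskStatus.FAILED]
    running.sort(key=lambda t: t.created_at, reverse=True)
    # Finished tasks sort by completion time, falling back to creation time
    completed.sort(key=lambda t: t.completed_at or t.created_at, reverse=True)
    failed.sort(key=lambda t: t.completed_at or t.created_at, reverse=True)
    return running, completed, failed
```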
191 application.py
@@ -4,6 +4,7 @@ Flask application factory for Podcastrr.
import os
from flask import Flask
from flask_migrate import Migrate
import jinja2

def create_app(config=None):
    """

@@ -22,7 +23,7 @@ def create_app(config=None):
    # Load default configuration
    app.config.from_mapping(
        SECRET_KEY=os.environ.get('SECRET_KEY', 'dev'),
        SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', 'sqlite:///podcastrr.db'),
        SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URI', f'sqlite:///{os.path.abspath(os.path.join(os.path.dirname(__file__), "instance", "podcastrr.db"))}'),
        SQLALCHEMY_TRACK_MODIFICATIONS=False,
        DOWNLOAD_PATH=os.environ.get('DOWNLOAD_PATH', os.path.join(os.getcwd(), 'downloads')),
    )

@@ -52,16 +53,190 @@ def create_app(config=None):
    # Ensure the download directory exists
    os.makedirs(app.config['DOWNLOAD_PATH'], exist_ok=True)

    # Run database migrations
    # Ensure the instance directory exists
    instance_path = os.path.join(os.path.dirname(__file__), 'instance')
    os.makedirs(instance_path, exist_ok=True)

    # Create an empty file in the instance directory to ensure it's writable
    try:
        test_file_path = os.path.join(instance_path, '.test_write')
        with open(test_file_path, 'w') as f:
            f.write('test')
        os.remove(test_file_path)
    except Exception as e:
        app.logger.error(f"Error writing to instance directory: {str(e)}")
        app.logger.error("This may indicate a permissions issue with the instance directory.")

    # Add custom Jinja2 filters
    @app.template_filter('isdigit')
    def isdigit_filter(s):
        """Check if a string contains only digits."""
        if s is None:
            return False
        return str(s).isdigit()

    # Add a context processor to make the fallback warning available to all templates
    @app.context_processor
    def inject_db_fallback_warning():
        """Inject the database fallback warning into all templates."""
        return {
            'db_fallback_warning': app.config.get('DB_FALLBACK_WARNING', False)
        }

    # Create database tables and run migrations
    with app.app_context():
        try:
            from migrations.add_season_explicit_naming_format import run_migration
            run_migration()
            # Get the database path from the config
            db_uri = app.config['SQLALCHEMY_DATABASE_URI']
            app.logger.info(f"Database URI: {db_uri}")

            # Run migration to add episode_ordering column
            from migrations.add_episode_ordering import run_migration as run_episode_ordering_migration
            run_episode_ordering_migration()
            # Extract the file path from the URI
            if db_uri.startswith('sqlite:///'):
                db_path = db_uri.replace('sqlite:///', '')
                app.logger.info(f"Database path: {db_path}")

                # Ensure the directory for the database file exists
                db_dir = os.path.dirname(db_path)
                if db_dir:  # Only create directory if path has a directory component
                    os.makedirs(db_dir, exist_ok=True)
                    app.logger.info(f"Created database directory: {db_dir}")

                # Test if we can write to the database directory
                try:
                    test_file_path = os.path.join(db_dir if db_dir else '.', '.test_db_write')
                    with open(test_file_path, 'w') as f:
                        f.write('test')
                    os.remove(test_file_path)
                    app.logger.info("Database directory is writable")
                except Exception as e:
                    app.logger.error(f"Error writing to database directory: {str(e)}")
                    app.logger.error("This may indicate a permissions issue with the database directory.")

                # Check if the database file exists
                if os.path.exists(db_path):
                    app.logger.info(f"Database file exists: {db_path}")

                    # Check if the database file is readable and writable
                    try:
                        # Check if readable
                        with open(db_path, 'r') as f:
                            pass
                        app.logger.info("Database file is readable")

                        # Check if writable
                        with open(db_path, 'a') as f:
                            pass
                        app.logger.info("Database file is writable")
                    except Exception as e:
                        app.logger.error(f"Error accessing database file: {str(e)}")
                        app.logger.error("This may indicate a permissions issue with the database file.")

                        # Try to fix permissions
                        try:
                            import stat
                            os.chmod(db_path, stat.S_IRUSR | stat.S_IWUSR)
                            app.logger.info("Updated database file permissions")
                        except Exception as e:
                            app.logger.error(f"Error updating database file permissions: {str(e)}")
                else:
                    app.logger.info(f"Database file does not exist: {db_path}")

                    # Try to create the database file directly
                    try:
                        with open(db_path, 'a') as f:
                            pass
                        app.logger.info(f"Created empty database file: {db_path}")

                        # Set appropriate permissions
                        try:
                            import stat
                            os.chmod(db_path, stat.S_IRUSR | stat.S_IWUSR)
                            app.logger.info("Set database file permissions")
                        except Exception as e:
                            app.logger.error(f"Error setting database file permissions: {str(e)}")
                    except Exception as e:
                        app.logger.error(f"Error creating database file: {str(e)}")
                        app.logger.error("This may indicate a permissions issue with the database file.")
            else:
                app.logger.info("Using a non-SQLite database, skipping file checks")

            # Create all tables first
            app.logger.info("Creating database tables...")

            # Try to create the database tables with retries
            max_retries = 3
            retry_count = 0
            fallback_used = False

            while retry_count < max_retries:
                try:
                    # Test the database connection first
                    app.logger.info("Testing database connection...")
                    with db.engine.connect() as conn:
                        app.logger.info("Database connection successful")

                    # Create the tables
                    db.create_all()
                    app.logger.info("Database tables created successfully")
                    break
                except Exception as e:
                    retry_count += 1
                    app.logger.error(f"Error creating database tables (attempt {retry_count}/{max_retries}): {str(e)}")

                    # If we've tried multiple times and still failing, try a fallback approach
                    if retry_count >= max_retries and not fallback_used:
                        app.logger.warning("Maximum retry attempts reached. Trying fallback approach...")

                        try:
                            # Create a fallback database in memory
                            app.logger.info("Attempting to use in-memory SQLite database as fallback")
                            app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///:memory:'

                            # Reinitialize the database with the new connection string
                            db.init_app(app)

                            # Reset retry counter and set fallback flag
                            retry_count = 0
                            fallback_used = True

                            app.logger.info("Fallback database configured, retrying...")

                            # Add a warning message that will be displayed in the application
                            app.config['DB_FALLBACK_WARNING'] = True
                            app.logger.warning("WARNING: Using in-memory database as fallback. Data will not be persisted between application restarts!")

                            continue
                        except Exception as fallback_error:
                            app.logger.error(f"Error setting up fallback database: {str(fallback_error)}")

                    if retry_count >= max_retries:
                        if fallback_used:
                            app.logger.error("Maximum retry attempts reached with fallback. Could not create database tables.")
                        else:
                            app.logger.error("Maximum retry attempts reached. Could not create database tables.")
                        raise

                    import time
                    time.sleep(1)  # Wait a second before retrying

            # Then run all migration scripts
            import importlib

            migrations_dir = os.path.join(os.path.dirname(__file__), 'migrations')
            for filename in os.listdir(migrations_dir):
                if filename.endswith('.py') and filename != '__init__.py':
                    module_name = f"migrations.{filename[:-3]}"
                    try:
                        migration_module = importlib.import_module(module_name)
                        if hasattr(migration_module, 'run_migration'):
                            app.logger.info(f"Running migration: {filename}")
                            migration_module.run_migration()
                    except Exception as e:
                        app.logger.error(f"Error running migration {filename}: {str(e)}")
        except Exception as e:
            app.logger.error(f"Error running migration: {str(e)}")
            app.logger.error(f"Error during database initialization: {str(e)}")
            # Log more detailed error information
            import traceback
            app.logger.error(traceback.format_exc())

    return app
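The `SQLALCHEMY_DATABASE_URI` change matters because a relative `sqlite:///` path resolves against the current working directory, so launching the app from a different directory could silently create a second database file. A small helper illustrating the absolute-path construction the new default uses (the helper name is mine):

```python
import os

def sqlite_uri(base_dir, *parts):
    """Build an absolute sqlite:/// URI so the database file's location
    does not depend on the process working directory."""
    return "sqlite:///" + os.path.abspath(os.path.join(base_dir, *parts))
```

In `application.py` the equivalent call uses `os.path.dirname(__file__)` as the base, anchoring the file under the project's `instance/` directory.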
24 init_db.py
@@ -1,13 +1,15 @@
from app import create_app
from application import create_app
from app.models.database import db
from app.models.settings import Settings
import importlib
import os

app = create_app()

with app.app_context():
    # Create all tables
    db.create_all()

    # Check if settings exist, create default if not
    if not Settings.query.first():
        default_settings = Settings(

@@ -20,5 +22,19 @@ with app.app_context():
        db.session.add(default_settings)
        db.session.commit()
        print("Created default settings")

    print("Database initialized successfully!")

    # Run all migration scripts
    print("Running migrations...")
    migrations_dir = os.path.join(os.path.dirname(__file__), 'migrations')
    for filename in os.listdir(migrations_dir):
        if filename.endswith('.py') and filename != '__init__.py':
            module_name = f"migrations.{filename[:-3]}"
            try:
                migration_module = importlib.import_module(module_name)
                if hasattr(migration_module, 'run_migration'):
                    print(f"Running migration: {filename}")
                    migration_module.run_migration()
            except Exception as e:
                print(f"Error running migration {filename}: {str(e)}")

    print("Database initialized successfully!")
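Both `application.py` and `init_db.py` discover migrations by scanning the `migrations` directory for Python files. The filename filtering step can be isolated for testing (the helper name is mine, not the project's):

```python
def migration_modules(filenames):
    """Map migration file names to importable module paths, skipping
    __init__.py and non-Python files, as both discovery loops do."""
    return [
        f"migrations.{name[:-3]}"
        for name in sorted(filenames)
        if name.endswith(".py") and name != "__init__.py"
    ]
```

Sorting the listing makes the run order deterministic, which `os.listdir` alone does not guarantee.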
6 main.py
@@ -25,10 +25,8 @@ def main():
    # Create the Flask app
    app = create_app()

    # Create database tables if they don't exist
    with app.app_context():
        db.create_all()
        print("Database tables created successfully!")
    # Database tables are created in application.py
    print("Database tables created successfully!")

    # Get port from environment variable or use default
    port = int(os.environ.get("PORT", 5000))
33 migrations/add_podcast_tags.py (new file)
@@ -0,0 +1,33 @@
"""
Migration script to add tags field to the podcasts table.
"""
import sqlite3
import os
from flask import current_app

def run_migration():
    """
    Run the migration to add the tags field to the podcasts table.
    """
    # Get the database path from the app config
    db_path = current_app.config['SQLALCHEMY_DATABASE_URI'].replace('sqlite:///', '')

    # Connect to the database
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    # Check if the tags column already exists in the podcasts table
    cursor.execute("PRAGMA table_info(podcasts)")
    columns = [column[1] for column in cursor.fetchall()]

    # Add the tags column if it doesn't exist
    if 'tags' not in columns:
        print("Adding 'tags' column to podcasts table...")
        cursor.execute("ALTER TABLE podcasts ADD COLUMN tags TEXT")

    # Commit the changes and close the connection
    conn.commit()
    conn.close()

    print("Podcast tags migration completed successfully!")
    return True
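The PRAGMA-then-ALTER pattern above is what makes the migration idempotent: re-running it against an already-migrated database is a no-op rather than an error. A self-contained sketch of the same pattern against an in-memory database (the helper name is mine; the table name matches the migration):

```python
import sqlite3

def ensure_column(conn, table, column, col_type="TEXT"):
    """Add a column only if it is missing, so re-running the migration is a no-op."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column in cols:
        return False
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {col_type}")
    conn.commit()
    return True
```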
@@ -1,26 +1,26 @@
# Core dependencies
Flask==3.1.1
SQLAlchemy==2.0.27
alembic==1.7.3
requests==2.32.4
beautifulsoup4==4.10.0
feedparser==6.0.8
python-dotenv==0.19.0
Flask>=3.1.1
SQLAlchemy>=2.0.27
alembic>=1.7.3
requests>=2.32.4
beautifulsoup4>=4.10.0
feedparser>=6.0.8
python-dotenv>=0.19.0

# Web interface
Flask-SQLAlchemy==3.1.0
Flask-WTF==0.15.1
Flask-Login==0.5.0
Flask-Migrate==3.1.0
Flask-SQLAlchemy>=3.1.0
Flask-WTF>=0.15.1
Flask-Login>=0.5.0
Flask-Migrate>=3.1.0

# API
Flask-RESTful==0.3.9
marshmallow==3.13.0
Flask-RESTful>=0.3.9
marshmallow>=3.13.0

# Testing
pytest==6.2.5
pytest-cov==2.12.1
pytest>=6.2.5
pytest-cov>=2.12.1

# Development
black==24.3.0
flake8==3.9.2
black>=24.3.0
flake8>=3.9.2
35 run_migrations.py (new file)
@@ -0,0 +1,35 @@
"""
Script to run all database migrations for Podcastrr.
This script is useful when you need to apply migrations to an existing database.
"""
import os
import sys
import subprocess

def main():
    """
    Run the init_db.py script to apply all migrations.
    """
    print("Running database migrations...")

    # Get the path to the init_db.py script
    init_db_path = os.path.join(os.path.dirname(__file__), 'init_db.py')

    # Run the init_db.py script
    try:
        result = subprocess.run([sys.executable, init_db_path],
                                capture_output=True,
                                text=True,
                                check=True)
        print(result.stdout)
        print("Migrations completed successfully!")
    except subprocess.CalledProcessError as e:
        print(f"Error running migrations: {e}")
        print(f"Output: {e.stdout}")
        print(f"Error: {e.stderr}")
        return 1

    return 0

if __name__ == '__main__':
    sys.exit(main())
@@ -208,6 +208,42 @@ html, body {
    background-color: #0d1117;
}

/* Stats Grid and Cards */
.stats-grid {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(200px, 1fr));
    gap: 16px;
    padding: 16px;
}

.stat-card {
    background-color: #161b22;
    border: 1px solid #30363d;
    border-radius: 6px;
    padding: 16px;
    text-align: center;
}

.stat-card h3 {
    font-size: 14px;
    font-weight: 600;
    color: #f0f6fc;
    margin-bottom: 8px;
}

.stat-value {
    font-size: 24px;
    font-weight: 700;
    color: #58a6ff;
    margin: 0;
}

.stat-subtitle {
    font-size: 11px;
    color: #7d8590;
    margin-top: 4px;
}

/* Data Table */
.data-table {
    width: 100%;

@@ -566,6 +602,7 @@ html, body {
    border: 1px solid #30363d;
    border-radius: 6px;
    overflow: hidden;
    box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}

.season-header {

@@ -576,12 +613,19 @@ html, body {
    background-color: #161b22;
    cursor: pointer;
    user-select: none;
    transition: background-color 0.2s ease;
}

.season-header:hover {
    background-color: #1f2937;
}

.season-header h3 {
    margin: 0;
    font-size: 16px;
    color: #f0f6fc;
    display: flex;
    align-items: center;
}

.episode-count {

@@ -592,13 +636,15 @@ html, body {
}

.toggle-icon {
    font-size: 12px;
    color: #7d8590;
    font-size: 14px;
    color: #58a6ff;
    transition: transform 0.2s ease;
}

.season-content {
    display: none;
    background-color: #0d1117;
    border-top: 1px solid #30363d;
}

/* Explicit Badge */

@@ -635,3 +681,125 @@ html, body {
::-webkit-scrollbar-thumb:hover {
    background: #484f58;
}

/* Task Page Styles */
.section {
    margin-bottom: 24px;
    padding: 0 16px;
}

.section-title {
    font-size: 16px;
    font-weight: 600;
    color: #f0f6fc;
    margin: 16px 0;
    padding-bottom: 8px;
    border-bottom: 1px solid #30363d;
}

.task-list {
    display: flex;
    flex-direction: column;
    gap: 12px;
}

.task-card {
    background-color: #161b22;
    border: 1px solid #30363d;
    border-radius: 6px;
    overflow: hidden;
}

.task-card.task-failed {
    border-left: 3px solid #f85149;
}

.task-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: 12px 16px;
    background-color: #21262d;
    border-bottom: 1px solid #30363d;
}

.task-title {
    font-size: 14px;
    font-weight: 600;
    color: #f0f6fc;
    margin: 0;
}

.task-status {
    font-size: 11px;
    font-weight: 600;
    padding: 2px 6px;
    border-radius: 3px;
    text-transform: uppercase;
}

.status-running {
    background-color: rgba(46, 160, 67, 0.15);
    color: #3fb950;
    border: 1px solid rgba(46, 160, 67, 0.4);
}

.status-pending {
    background-color: rgba(187, 128, 9, 0.15);
    color: #d29922;
    border: 1px solid rgba(187, 128, 9, 0.4);
}

.status-completed {
    background-color: rgba(46, 160, 67, 0.15);
    color: #3fb950;
    border: 1px solid rgba(46, 160, 67, 0.4);
}

.status-failed {
    background-color: rgba(248, 81, 73, 0.15);
    color: #f85149;
    border: 1px solid rgba(248, 81, 73, 0.4);
}

.task-details {
    padding: 12px 16px;
}

.task-message {
    font-size: 12px;
    color: #c9d1d9;
    margin-bottom: 8px;
}

.task-error {
    font-size: 12px;
    color: #f85149;
    margin-bottom: 8px;
    padding: 8px;
    background-color: rgba(248, 81, 73, 0.1);
    border-radius: 4px;
}

.task-meta {
    display: flex;
    flex-wrap: wrap;
    gap: 12px;
    margin-top: 12px;
    font-size: 11px;
    color: #7d8590;
}

.progress-bar {
    height: 4px;
    background-color: #30363d;
    border-radius: 2px;
    overflow: hidden;
    margin: 8px 0;
}

.progress-fill {
    height: 100%;
    background-color: #3fb950;
    width: 0;
}
@@ -46,6 +46,12 @@
                Settings
            </a>
        </li>
        <li>
            <a href="{{ url_for('tasks.view_tasks') }}"
               class="{% if request.endpoint == 'tasks.view_tasks' %}active{% endif %}">
                Tasks
            </a>
        </li>
    </ul>
</div>
<!-- Task Status Area -->

@@ -84,6 +90,17 @@
    {% endif %}
    {% endwith %}

    <!-- Database Fallback Warning -->
    {% if db_fallback_warning %}
    <div class="flash-messages">
        <div class="flash-message error" style="background-color: #f85149; color: white; font-weight: bold;">
            WARNING: Using in-memory database as fallback. Data will not be persisted between application restarts!
            <br>
            <span style="font-size: 0.9em;">Please check the application logs for details on how to fix this issue.</span>
        </div>
    </div>
    {% endif %}

    {% block content %}{% endblock %}
    </div>
</div>
@@ -25,17 +25,18 @@

    <div class="stat-card">
        <h3>Episodes</h3>
        <p class="stat-value">0</p>
        <p class="stat-value">{{ not_downloaded_episodes }} / {{ downloaded_episodes }} / {{ total_episodes }}</p>
        <p class="stat-subtitle">Not Downloaded / Downloaded / Total</p>
    </div>

    <div class="stat-card">
        <h3>Downloads</h3>
        <p class="stat-value">0</p>
        <p class="stat-value">{{ downloaded_episodes }}</p>
    </div>

    <div class="stat-card">
        <h3>Storage</h3>
        <p class="stat-value">0 GB</p>
        <p class="stat-value">{{ formatted_storage }}</p>
    </div>
</div>
38 templates/podcasts/import_opml.html (new file)
@@ -0,0 +1,38 @@
{% extends "base.html" %}

{% block content %}
<div class="container">
    <h1>Import Podcasts from OPML</h1>

    <div class="card">
        <div class="card-body">
            <p>Upload an OPML file to import podcasts. OPML files are commonly used to export podcast subscriptions from other podcast applications.</p>

            <form action="{{ url_for('podcasts.import_opml') }}" method="post" enctype="multipart/form-data">
                <div class="form-group">
                    <label for="opml_file">OPML File</label>
                    <input type="file" class="form-control-file" id="opml_file" name="opml_file" required>
                    <small class="form-text text-muted">Select an OPML file (.opml or .xml)</small>
                </div>

                <button type="submit" class="btn btn-primary">Import</button>
                <a href="{{ url_for('podcasts.index') }}" class="btn btn-secondary">Cancel</a>
            </form>
        </div>
    </div>

    <div class="mt-4">
        <h2>What is OPML?</h2>
        <p>OPML (Outline Processor Markup Language) is a format commonly used to exchange lists of RSS feeds between applications. Most podcast applications allow you to export your subscriptions as an OPML file, which you can then import into Podcastrr.</p>

        <h2>How to Export OPML from Other Applications</h2>
        <ul>
            <li><strong>Apple Podcasts:</strong> Go to File > Library > Export Library</li>
            <li><strong>Pocket Casts:</strong> Go to Profile > Settings > Export OPML</li>
            <li><strong>Spotify:</strong> Spotify doesn't support OPML export directly</li>
            <li><strong>Google Podcasts:</strong> Go to Settings > Export subscriptions</li>
            <li><strong>Overcast:</strong> Go to Settings > Export OPML</li>
        </ul>
    </div>
</div>
{% endblock %}
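For reference, an OPML file is just XML: each subscription is an `<outline>` element whose `xmlUrl` attribute carries the feed URL. The pairs Podcastrr would read from such a file can be extracted with the standard library alone; the sample document and function name here are illustrative, not the app's actual import code:

```python
import xml.etree.ElementTree as ET

SAMPLE_OPML = """<?xml version="1.0"?>
<opml version="2.0">
  <head><title>Subscriptions</title></head>
  <body>
    <outline type="rss" text="Show A" xmlUrl="http://example.com/a.xml"/>
    <outline type="rss" text="Show B" xmlUrl="http://example.com/b.xml"/>
  </body>
</opml>"""

def feeds_from_opml(opml_text):
    """Collect (title, feed_url) pairs from every rss outline, at any nesting depth."""
    root = ET.fromstring(opml_text)
    return [
        (node.get("text"), node.get("xmlUrl"))
        for node in root.iter("outline")
        if node.get("type") == "rss" and node.get("xmlUrl")
    ]
```

Using `root.iter("outline")` rather than a direct child lookup also handles exporters that group feeds into folders of nested outlines.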
@@ -20,6 +20,29 @@
    </form>
</div>

<!-- Add by RSS URL Form -->
<div class="form-container mt-4">
    <h3>Add by RSS Feed URL</h3>
    <form action="{{ url_for('podcasts.add_by_url') }}" method="post">
        <div class="form-group">
            <label for="feed_url">Podcast RSS Feed URL:</label>
            <input type="url" id="feed_url" name="feed_url"
                   placeholder="https://example.com/podcast/feed.xml" required>
        </div>
        <button type="submit" class="btn btn-primary">Add Podcast</button>
    </form>
</div>

<!-- OPML Import/Export -->
<div class="form-container mt-4">
    <h3>Import/Export Podcasts</h3>
    <p>Import podcasts from an OPML file or export your current podcasts to an OPML file.</p>
    <div class="btn-group">
        <a href="{{ url_for('podcasts.import_opml') }}" class="btn btn-primary">Import OPML</a>
        <a href="{{ url_for('podcasts.export_opml') }}" class="btn btn-secondary">Export OPML</a>
    </div>
</div>

<!-- Search Results -->
{% if results %}
<div class="content-area">
96 templates/podcasts/tags_modal.html (new file)
@@ -0,0 +1,96 @@
<!-- Tags Modal -->
<div id="tags-modal" class="modal">
    <div class="modal-content">
        <div class="modal-header">
            <h2>Manage Tags for {{ podcast.title }}</h2>
            <span class="close" onclick="document.getElementById('tags-modal').style.display='none'">×</span>
        </div>
        <div class="modal-body">
            <form action="{{ url_for('podcasts.update_tags', podcast_id=podcast.id) }}" method="post">
                <div class="form-group">
                    <label for="tags">Tags (comma-separated):</label>
                    <input type="text" id="tags" name="tags" class="form-control"
                           value="{{ podcast.tags }}"
                           placeholder="news, technology, comedy, etc.">
                    <small class="form-text text-muted">Enter tags separated by commas. Tags help you organize and filter your podcasts.</small>
                </div>

                <div class="form-group">
                    <h4>Common Tags</h4>
                    <div class="tag-suggestions">
                        <span class="tag-suggestion" onclick="addTag('news')">news</span>
                        <span class="tag-suggestion" onclick="addTag('technology')">technology</span>
                        <span class="tag-suggestion" onclick="addTag('comedy')">comedy</span>
                        <span class="tag-suggestion" onclick="addTag('business')">business</span>
                        <span class="tag-suggestion" onclick="addTag('politics')">politics</span>
                        <span class="tag-suggestion" onclick="addTag('education')">education</span>
                        <span class="tag-suggestion" onclick="addTag('entertainment')">entertainment</span>
                        <span class="tag-suggestion" onclick="addTag('health')">health</span>
                        <span class="tag-suggestion" onclick="addTag('science')">science</span>
                        <span class="tag-suggestion" onclick="addTag('sports')">sports</span>
                        <span class="tag-suggestion" onclick="addTag('arts')">arts</span>
                        <span class="tag-suggestion" onclick="addTag('music')">music</span>
                        <span class="tag-suggestion" onclick="addTag('favorites')">favorites</span>
                    </div>
                </div>

                <div class="modal-footer">
                    <button type="submit" class="btn btn-primary">Save Tags</button>
                    <button type="button" class="btn btn-secondary" onclick="document.getElementById('tags-modal').style.display='none'">Cancel</button>
                </div>
            </form>
        </div>
    </div>
</div>

<script>
function addTag(tag) {
    const tagsInput = document.getElementById('tags');
    const currentTags = tagsInput.value.split(',').map(t => t.trim()).filter(t => t);

    // Add the tag if it's not already in the list
    if (!currentTags.includes(tag)) {
        currentTags.push(tag);
        tagsInput.value = currentTags.join(', ');
    }
}
</script>

<style>
.tag-badge {
    display: inline-block;
    padding: 2px 6px;
    margin-right: 4px;
    background-color: #1f6feb;
    color: #ffffff;
    border-radius: 12px;
    font-size: 10px;
    text-decoration: none;
}

.tag-badge:hover {
    background-color: #388bfd;
    text-decoration: none;
}

.tag-suggestions {
    display: flex;
    flex-wrap: wrap;
    gap: 8px;
    margin-top: 8px;
}

.tag-suggestion {
    display: inline-block;
    padding: 4px 8px;
    background-color: #21262d;
    color: #c9d1d9;
    border-radius: 12px;
    font-size: 12px;
    cursor: pointer;
}

.tag-suggestion:hover {
    background-color: #30363d;
}
</style>
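The `podcast.get_tags()` call used on the view page presumably applies the same normalization server-side that `addTag()` applies in the browser. A hedged sketch of that counterpart (the function body is an assumption; the model's actual method is not shown in this diff):

```python
def parse_tags(raw):
    """Split a comma-separated tag string: trim whitespace, drop empties,
    and de-duplicate while preserving order (the same rules as addTag())."""
    seen = []
    for tag in (raw or "").split(","):
        tag = tag.strip()
        if tag and tag not in seen:
            seen.append(tag)
    return seen
```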
@ -10,6 +10,9 @@
|
|||
<form action="{{ url_for('podcasts.update', podcast_id=podcast.id) }}" method="post" style="display: inline;">
|
||||
<button type="submit" class="btn btn-primary">Update Episodes</button>
|
||||
</form>
|
||||
<form action="{{ url_for('podcasts.download_all', podcast_id=podcast.id) }}" method="post" style="display: inline; margin-left: 8px;">
|
||||
<button type="submit" class="btn btn-success">Download All Episodes</button>
|
||||
</form>
|
||||
<form action="{{ url_for('podcasts.delete', podcast_id=podcast.id) }}" method="post"
|
||||
style="display: inline; margin-left: 8px;"
|
||||
onsubmit="return confirm('Are you sure you want to delete this podcast?');">
|
||||
|
@ -47,7 +50,17 @@
|
|||
<a href="{{ podcast.feed_url }}" target="_blank" style="color: #58a6ff;">View RSS Feed</a>
{% endif %}
<a href="#" onclick="document.getElementById('naming-format-modal').style.display='block'; return false;" style="color: #58a6ff;">Configure Naming Format</a>
<a href="#" onclick="document.getElementById('tags-modal').style.display='block'; return false;" style="color: #58a6ff;">Manage Tags</a>
</div>

{% if podcast.tags %}
<div style="margin-top: 8px;">
    <span style="font-size: 11px; color: #7d8590;">Tags: </span>
    {% for tag in podcast.get_tags() %}
    <a href="{{ url_for('podcasts.filter_by_tag', tag=tag) }}" class="tag-badge">{{ tag }}</a>
    {% endfor %}
</div>
{% endif %}
</div>
</div>
</div>

@@ -55,8 +68,8 @@
<!-- Toolbar -->
<div class="toolbar">
    <span class="toolbar-btn">{{ episodes|length }} Episodes</span>
    <div style="margin-left: auto;">
        <form action="{{ url_for('podcasts.verify', podcast_id=podcast.id) }}" method="post" style="display: inline;">
    <div style="margin-left: auto; display: flex; gap: 8px;">
        <form action="{{ url_for('podcasts.verify', podcast_id=podcast.id) }}" method="post" style="display: inline-block;">
            <button type="submit" class="toolbar-btn">Verify Files</button>
        </form>
        <button class="toolbar-btn" onclick="window.location.reload()">Refresh</button>

@@ -67,40 +80,62 @@
<!-- Episodes Table -->
<div class="content-area">
    {% if episodes %}
        {# Check if any episodes have season information #}
        {% set has_seasons = false %}
        {# Group episodes by season, or by year if no season is available #}
        {% set seasons = {} %}
        {% set season_ids = {} %}
        {% set season_download_counts = {} %}
        {% set season_counter = 0 %}

        {% for episode in episodes %}
            {% if episode.season and not has_seasons %}
                {% set has_seasons = true %}
            {% endif %}
            {% set season_key = "" %}
            {% if episode.season %}
                {# Use season number if available #}
                {% set season_key = "Season " ~ episode.season %}
            {% elif episode.published_date %}
                {# Use year as season if no season number but published date is available #}
                {% set season_key = episode.published_date.strftime('%Y') %}
            {% else %}
                {# Fallback for episodes with no season or published date #}
                {% set season_key = "Unsorted Episodes" %}
            {% endif %}

            {# Initialize season if not exists #}
            {% if season_key not in seasons %}
                {% set season_counter = season_counter + 1 %}
                {% set _ = seasons.update({season_key: []}) %}
                {% set _ = season_ids.update({season_key: season_counter}) %}
                {% set _ = season_download_counts.update({season_key: {'downloaded': 0, 'total': 0}}) %}
            {% endif %}

            {# Add episode to season #}
            {% set _ = seasons[season_key].append(episode) %}

            {# Update download counts #}
            {% if episode.downloaded %}
                {% set downloaded = season_download_counts[season_key]['downloaded'] + 1 %}
                {% set total = season_download_counts[season_key]['total'] + 1 %}
            {% else %}
                {% set downloaded = season_download_counts[season_key]['downloaded'] %}
                {% set total = season_download_counts[season_key]['total'] + 1 %}
            {% endif %}
            {% set _ = season_download_counts.update({season_key: {'downloaded': downloaded, 'total': total}}) %}
        {% endfor %}

        {% if has_seasons %}
        {# Group episodes by season #}
        {% set seasons = {} %}
        {% for episode in episodes %}
            {% set season_num = episode.season|default(0) %}
            {% if season_num not in seasons %}
                {% set seasons = seasons|merge({season_num: []}) %}
            {% endif %}
            {% set _ = seasons[season_num].append(episode) %}
        {% endfor %}
        {# Display seasons in reverse order (newest first) #}
        {% if seasons %}
            {% for season_key, episodes_list in seasons|dictsort|reverse %}
                {% set season_id = season_ids[season_key] %}
                {% set download_stats = season_download_counts[season_key] %}

                {# Display seasons in order #}
                {% for season_num in seasons|sort %}
                <div class="season-accordion">
                    <div class="season-header" onclick="toggleSeason('{{ season_num }}')">
                    <div class="season-header" onclick="toggleSeason()">
                        <h3>
                            {% if season_num == 0 %}
                                Unsorted Episodes
                            {% else %}
                                Season {{ season_num }}
                            {% endif %}
                            <span class="episode-count">({{ seasons[season_num]|length }} episodes)</span>
                            {{ season_key }}
                            <span class="episode-count">({{ download_stats['downloaded'] }}/{{ download_stats['total'] }} episodes)</span>
                        </h3>
                        <span id="toggle-icon-{{ season_num }}" class="toggle-icon">▼</span>
                        <span id="toggle-icon-season_{{ season_id }}" class="toggle-icon">▼</span>
                    </div>
                    <div id="season-{{ season_num }}" class="season-content">
                    <div id="season-season_{{ season_id }}" class="season-content">
                        <table class="data-table">
                            <thead>
                                <tr>

@@ -112,16 +147,16 @@
                                </tr>
                            </thead>
                            <tbody>
                                {% for episode in seasons[season_num]|sort(attribute='episode_number') %}
                                {% for episode in episodes_list|sort(attribute='published_date', reverse=true) %}
                                <tr>
                                    <td>
                                        <div class="cell-title">
                                            {% if episode.episode_number %}
                                            <span style="color: #58a6ff; font-weight: bold; margin-right: 5px;">
                                                {% if episode.season %}
                                                    S{{ episode.season }}E{{ episode.episode_number }}
                                                    S{{ '%02d' % episode.season }}E{{ '%02d' % episode.episode_number|int if (episode.episode_number|string).isdigit() else episode.episode_number }}
                                                {% else %}
                                                    #{{ episode.episode_number }}
                                                    #{{ '%02d' % episode.episode_number|int if (episode.episode_number|string).isdigit() else episode.episode_number }}
                                                {% endif %}
                                            </span>
                                            {% endif %}

@@ -194,7 +229,7 @@
<td>
    <div class="cell-title">
        {% if episode.episode_number %}
        <span style="color: #58a6ff; font-weight: bold; margin-right: 5px;">#{{ episode.episode_number }}</span>
        <span style="color: #58a6ff; font-weight: bold; margin-right: 5px;">#{{ '%02d' % episode.episode_number|int if (episode.episode_number|string).isdigit() else episode.episode_number }}</span>
        {% endif %}
        {{ episode.title }}
        {% if episode.explicit %}

@@ -260,27 +295,46 @@
{% block scripts %}
<script>
function toggleSeason(seasonId) {
    const seasonContent = document.getElementById('season-' + seasonId);
    const toggleIcon = document.getElementById('toggle-icon-' + seasonId);
    // Find the clicked header element
    const clickedHeader = event.currentTarget;

    // Find the content and toggle icon elements
    const seasonContent = clickedHeader.nextElementSibling;
    const toggleIcon = clickedHeader.querySelector('.toggle-icon');

    if (seasonContent.style.display === 'block') {
        // If already open, close it
        seasonContent.style.display = 'none';
        toggleIcon.innerHTML = '▼';
    } else {
        // Close all other accordions first
        const allSeasonContents = document.querySelectorAll('.season-content');
        const allToggleIcons = document.querySelectorAll('.toggle-icon');

        allSeasonContents.forEach(function(content) {
            content.style.display = 'none';
        });

        allToggleIcons.forEach(function(icon) {
            icon.innerHTML = '▼';
        });

        // Then open the clicked one
        seasonContent.style.display = 'block';
        toggleIcon.innerHTML = '▲';
    }
}

// Open the first season by default when the page loads
// Initialize all season accordions as collapsed by default
document.addEventListener('DOMContentLoaded', function() {
    const firstSeasonAccordion = document.querySelector('.season-accordion');
    if (firstSeasonAccordion) {
        const seasonId = firstSeasonAccordion.querySelector('.season-content').id.replace('season-', '');
        toggleSeason(seasonId);
    }
    // Make sure all season contents have display style set to none (collapsed)
    const allSeasonContents = document.querySelectorAll('.season-content');
    allSeasonContents.forEach(function(content) {
        content.style.display = 'none';
    });
});
</script>

{% include 'podcasts/naming_format_modal.html' %}
{% include 'podcasts/tags_modal.html' %}
{% endblock %}
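The template above builds the season buckets inline with Jinja `set`/`update` tricks, which is hard to follow. The same logic can be mirrored in plain Python for clarity (a sketch only; it assumes episode objects expose `season`, `published_date`, and `downloaded` attributes, matching the template's usage):

```python
from collections import OrderedDict
from datetime import datetime
from types import SimpleNamespace

def group_by_season(episodes):
    """Bucket episodes by season number, falling back to publication year,
    then to an 'Unsorted Episodes' bucket; track download counts per bucket."""
    seasons = OrderedDict()
    counts = {}
    for ep in episodes:
        if getattr(ep, "season", None):
            key = f"Season {ep.season}"
        elif getattr(ep, "published_date", None):
            key = ep.published_date.strftime("%Y")
        else:
            key = "Unsorted Episodes"
        seasons.setdefault(key, []).append(ep)
        c = counts.setdefault(key, {"downloaded": 0, "total": 0})
        c["total"] += 1
        if getattr(ep, "downloaded", False):
            c["downloaded"] += 1
    return seasons, counts

# Example: two episodes in season 1, one dated-only episode.
eps = [
    SimpleNamespace(season=1, published_date=None, downloaded=True),
    SimpleNamespace(season=1, published_date=None, downloaded=False),
    SimpleNamespace(season=None, published_date=datetime(2023, 5, 1), downloaded=False),
]
seasons, counts = group_by_season(eps)
print(list(seasons))       # ['Season 1', '2023']
print(counts["Season 1"])  # {'downloaded': 1, 'total': 2}
```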
128	templates/tasks/index.html	Normal file

@@ -0,0 +1,128 @@
{% extends "base.html" %}

{% block title %}Task History{% endblock %}

{% block content %}
<div class="content-header">
    <h1 class="content-title">Task History</h1>
    <div class="content-actions">
        <button class="btn btn-sm" id="refresh-tasks">Refresh</button>
    </div>
</div>

<div class="content-area">
    <!-- In Progress Tasks -->
    <div class="section">
        <h2 class="section-title">In Progress Tasks</h2>
        {% if running_tasks %}
        <div class="task-list">
            {% for task in running_tasks %}
            <div class="task-card">
                <div class="task-header">
                    <h3 class="task-title">{{ task.description }}</h3>
                    <span class="task-status status-{{ task.status.value }}">{{ task.status.value }}</span>
                </div>
                <div class="task-details">
                    <p class="task-message">{{ task.message }}</p>
                    <div class="progress-bar">
                        <div class="progress-fill" data-progress="{{ task.progress }}"></div>
                    </div>
                    <div class="task-meta">
                        <span class="task-type">Type: {{ task.type }}</span>
                        <span class="task-time">Started: {{ task.started_at.strftime('%Y-%m-%d %H:%M:%S') if task.started_at else 'Pending' }}</span>
                        <span class="task-id">ID: {{ task.id }}</span>
                    </div>
                </div>
            </div>
            {% endfor %}
        </div>
        {% else %}
        <div class="empty-state">
            <p>No tasks currently in progress.</p>
        </div>
        {% endif %}
    </div>

    <!-- Completed Tasks -->
    <div class="section">
        <h2 class="section-title">Completed Tasks</h2>
        {% if completed_tasks %}
        <div class="task-list">
            {% for task in completed_tasks %}
            <div class="task-card">
                <div class="task-header">
                    <h3 class="task-title">{{ task.description }}</h3>
                    <span class="task-status status-completed">Completed</span>
                </div>
                <div class="task-details">
                    <p class="task-message">{{ task.message }}</p>
                    <div class="task-meta">
                        <span class="task-type">Type: {{ task.type }}</span>
                        <span class="task-time">Completed: {{ task.completed_at.strftime('%Y-%m-%d %H:%M:%S') if task.completed_at else 'Unknown' }}</span>
                        <span class="task-duration">Duration: {{ ((task.completed_at - task.started_at).total_seconds()|int) if task.completed_at and task.started_at else 0 }} seconds</span>
                    </div>
                </div>
            </div>
            {% endfor %}
        </div>
        {% else %}
        <div class="empty-state">
            <p>No completed tasks in history.</p>
        </div>
        {% endif %}
    </div>

    <!-- Failed Tasks -->
    <div class="section">
        <h2 class="section-title">Failed Tasks</h2>
        {% if failed_tasks %}
        <div class="task-list">
            {% for task in failed_tasks %}
            <div class="task-card task-failed">
                <div class="task-header">
                    <h3 class="task-title">{{ task.description }}</h3>
                    <span class="task-status status-failed">Failed</span>
                </div>
                <div class="task-details">
                    <p class="task-message">{{ task.message }}</p>
                    <p class="task-error">{{ task.error }}</p>
                    <div class="task-meta">
                        <span class="task-type">Type: {{ task.type }}</span>
                        <span class="task-time">Failed at: {{ task.completed_at.strftime('%Y-%m-%d %H:%M:%S') if task.completed_at else 'Unknown' }}</span>
                    </div>
                </div>
            </div>
            {% endfor %}
        </div>
        {% else %}
        <div class="empty-state">
            <p>No failed tasks in history.</p>
        </div>
        {% endif %}
    </div>
</div>
{% endblock %}

{% block scripts %}
<script>
document.addEventListener('DOMContentLoaded', function() {
    // Set progress bar widths based on data-progress attribute
    const progressBars = document.querySelectorAll('.progress-fill');
    progressBars.forEach(bar => {
        const progress = bar.getAttribute('data-progress');
        bar.style.width = progress + '%';
        bar.style.height = '100%';
        bar.style.backgroundColor = '#3fb950';
    });

    // Refresh button functionality
    document.getElementById('refresh-tasks').addEventListener('click', function() {
        this.textContent = 'Refreshing...';
        this.disabled = true;

        // Reload the page to refresh the task list
        window.location.reload();
    });
});
</script>
{% endblock %}
82	test_migration.py	Normal file

@@ -0,0 +1,82 @@
"""
Test script to verify that the 'tags' column exists in the 'podcasts' table.
This script can be used to test if the migration has been applied correctly.
"""
import os
import sqlite3
import sys

def main():
    """
    Check if the 'tags' column exists in the 'podcasts' table.
    If not, suggest running the migration.
    """
    print("Testing database schema...")

    # Find the database file
    db_path = 'instance/podcastrr.db'
    if not os.path.exists(db_path):
        db_path = 'podcastrr.db'
        if not os.path.exists(db_path):
            print("Error: Database file not found.")
            return 1

    print(f"Using database: {db_path}")

    # Connect to the database
    try:
        conn = sqlite3.connect(db_path)
        cursor = conn.cursor()

        # Check if the podcasts table exists
        cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='podcasts'")
        if not cursor.fetchone():
            print("Error: The 'podcasts' table does not exist in the database.")
            print("You may need to initialize the database first with 'python init_db.py'")
            return 1

        # Check if the tags column exists
        cursor.execute("PRAGMA table_info(podcasts)")
        columns = [column[1] for column in cursor.fetchall()]

        if 'tags' in columns:
            print("Success: The 'tags' column exists in the 'podcasts' table.")
            print("The migration has been applied correctly.")
            return 0
        else:
            print("Error: The 'tags' column does not exist in the 'podcasts' table.")
            print("You need to run the migration with 'python run_migrations.py'")

            # Ask if the user wants to run the migration now
            response = input("Do you want to run the migration now? (y/n): ")
            if response.lower() == 'y':
                print("Running migration...")
                import subprocess
                result = subprocess.run([sys.executable, 'run_migrations.py'],
                                        capture_output=True,
                                        text=True)
                print(result.stdout)

                # Check if the migration was successful
                conn = sqlite3.connect(db_path)
                cursor = conn.cursor()
                cursor.execute("PRAGMA table_info(podcasts)")
                columns = [column[1] for column in cursor.fetchall()]

                if 'tags' in columns:
                    print("Success: The migration was applied successfully.")
                    return 0
                else:
                    print("Error: The migration failed to add the 'tags' column.")
                    return 1
            else:
                return 1
    except Exception as e:
        print(f"Error: {str(e)}")
        return 1
    finally:
        if 'conn' in locals():
            conn.close()

if __name__ == '__main__':
    sys.exit(main())
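The migration that this script verifies amounts to an idempotent `ALTER TABLE`. A minimal sketch, assuming SQLite and a plain `TEXT` column (the `ensure_column` helper is hypothetical, not taken from `run_migrations.py`):

```python
import sqlite3

def ensure_column(conn, table, column, decl="TEXT"):
    """Add `column` to `table` if it is missing; safe to run repeatedly."""
    existing = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in existing:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
        conn.commit()

# Demonstration against a throwaway in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE podcasts (id INTEGER PRIMARY KEY, title TEXT)")
ensure_column(conn, "podcasts", "tags")
ensure_column(conn, "podcasts", "tags")  # second call is a no-op
cols = [row[1] for row in conn.execute("PRAGMA table_info(podcasts)")]
print(cols)  # ['id', 'title', 'tags']
```

Because the helper checks `PRAGMA table_info` first, rerunning the migration never raises "duplicate column name", which is what lets the app run migrations unconditionally at startup.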