Compare commits


273 Commits
v0.3 ... main

Author SHA1 Message Date
DJObleezy
b9c04a39e8 Enhance pool fees display in dashboard
Updated the star rating display for pool fees in `dashboard.html`. When `metrics.pool_fees_percentage` is between 0.9 and 1.3, the number of stars shown has been increased from one to three, improving the visual representation of the metric.
2025-04-23 23:05:29 -07:00
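The star-rating rule described above could be sketched as a small helper. This is a hypothetical stand-in — the real logic lives in the `dashboard.html` template, and the function name and handling of missing values are assumptions:

```python
def pool_fee_stars(fee_percentage):
    """Return how many stars to show for a pool fee percentage.

    Hypothetical helper mirroring the template logic: fees in the
    0.9-1.3 band now earn three stars instead of one; anything else
    keeps a single star, and a missing value shows none.
    """
    if fee_percentage is None:
        return 0
    if 0.9 <= fee_percentage <= 1.3:
        return 3
    return 1
```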
DJObleezy
6ba7545278 Remove text-shadow for cleaner UI across stylesheets
This commit removes `text-shadow` properties from various CSS classes in `blocks.css`, `boot.css`, `common.css`, `dashboard.css`, `error.css`, `retro-refresh.css`, and `workers.css`, enhancing readability and reducing visual clutter.

In `main.js`, the inline styles for profit value elements are updated to eliminate `text-shadow`, simplifying styling logic.

Additionally, `text-shadow` is removed from headers and interface elements in `theme.js`, with box-shadow values adjusted for improved visual depth.

These changes reflect a design decision to create a more modern and streamlined interface.
2025-04-23 23:00:06 -07:00
DJObleezy
50b5241812 Remove version number from page titles
Updated titles in base.html, blocks.html, dashboard.html,
notifications.html, and workers.html to eliminate the
version number "v 0.3", creating a more concise and
uniform appearance across the application.
2025-04-23 22:03:29 -07:00
DJObleezy
54957babc3 Enhance services and improve code structure
- Added health check for Redis in `docker-compose.yml`.
- Introduced new environment variables for the dashboard service.
- Updated Redis dependency condition for the dashboard service.
- Modified Dockerfile to use Python 3.9.18 and streamlined directory creation.
- Enhanced `minify.py` with logging and improved error handling.
- Added methods in `OceanData` and `WorkerData` for better data handling.
- Improved error handling and logging in `NotificationService`.
- Refactored `BitcoinProgressBar.js` for better organization and theme support.
- Updated `blocks.js` with new helper functions for block data management.
- Enhanced `dashboard.html` for improved display of network stats.
2025-04-23 21:56:25 -07:00
DJObleezy
4c4750cb24 Enhance README.md with new features and API endpoints
Updated README.md to include:
- New **Error Handling** section for user-friendly error pages.
- Introduction of the **DeepSea Theme** with immersive effects and a toggle option.
- Added environment variables `NETWORK_FEE` and `TIMEZONE` in `docker-compose.yml`.
- New API endpoints for metrics, timezones, configuration, and health status.

These changes improve user experience, configuration options, and application resilience.
2025-04-23 20:06:14 -07:00
DJObleezy
2021583951 Add DeepSea theme styles and effects
Implemented new CSS styles for the DeepSea theme in `boot.css`, enhancing visual elements like colors, shadows, and animations for various components. Updated `common.css` to include a footer style and a theme loader for improved user experience. Removed inline styles from `base.html` and replaced them with references to the new styles. Added JavaScript to create dynamic underwater effects when the DeepSea theme is active. Updated footer to include a link to Ocean.xyz.
2025-04-23 20:00:27 -07:00
DJObleezy
00d3bbdee9 Update deployment instructions and chart indicator style
- Changed repository cloning instructions to use `DeepSea-Dashboard`.
- Adjusted the position of the `lowHashrateIndicator` in `main.js` from bottom to top right.
- Added a background color to the indicator for improved visibility.
2025-04-23 18:47:50 -07:00
DJObleezy
3ebef744dc
Update deployment_steps.md 2025-04-23 14:33:20 -07:00
DJObleezy
f302d35945
Update README.md 2025-04-23 14:32:24 -07:00
DJObleezy
28099c37ec
Update README.md 2025-04-23 14:31:03 -07:00
DJObleezy
2cb166b405
Update README.md 2025-04-23 14:28:15 -07:00
DJObleezy
259c877f69 Update project structure and enhance documentation
Added new services: `notification_service.py`, `minify.py`, and `theme-toggle.css`. Renamed `config.json` for clarity and updated `workers.html` to `workers dashboard.html`. Introduced `LICENSE.md` and improved project structure documentation. Adjusted formatting in the "Component Interactions" diagram for consistency.
2025-04-23 14:23:07 -07:00
DJObleezy
3d62c42bbf Update project structure with new services and files
Added notification_service.py and minify.py for enhanced functionality. Introduced docker-compose.yml for easier orchestration. Updated templates with notifications.html and added theme-toggle.css and notifications.js in static assets. Retained existing styles and functionalities. Moved project_structure.md, added LICENSE.md, and created logs/ directory for runtime logs.
2025-04-23 14:20:19 -07:00
DJObleezy
076fba75c8 Update README with new config.json URL
Changed the link to the configuration file in README.md
to point to the new repository for the Deepsea Dashboard.
This reflects the project's renaming and ensures users
access the correct configuration settings.
2025-04-23 14:16:46 -07:00
DJObleezy
a3acee1782 Enhance z-index and mobile styles for UI elements
- Added `z-index` to `body::before` for proper stacking.
- Implemented mobile-specific styles for `#skip-button`.
- Established higher `z-index` for `#config-form` with relative positioning.
2025-04-23 14:12:12 -07:00
DJObleezy
a71a6ce03a Add theme change listener and first startup handling
- Implement `setupThemeChangeListener` in `main.js` to detect theme changes across tabs, save font configurations, and recreate the chart with appropriate styles for mobile and desktop.
- Introduce new functions in `theme.js` to manage theme preferences on first startup, including setting the DeepSea theme as default and checking for previous app launches.
2025-04-23 13:55:19 -07:00
DJObleezy
f617342c23 Add theme loading styles and improve theme management
Updated `theme-toggle.css` with new styles for DeepSea and Bitcoin themes, including a loading screen. Introduced `isApplyingTheme` flag in `main.js` to manage theme application state. Modified `applyDeepSeaTheme` and `toggleTheme` functions in `theme.js` to enhance theme switching experience with dynamic loading messages. Enhanced `base.html` to preload styles and prevent flickering during theme transitions.
2025-04-23 13:26:07 -07:00
DJObleezy
2b09ad6c15 Add DeepSea Theme with ocean effects and glitch animations
Implemented CSS styles and animations for an "Ocean Wave Ripple Effect" and a "Retro Glitch Effect" in the new "DeepSea Theme". Added keyframes, background images, and opacity settings to create underwater light rays and digital noise. Included JavaScript to dynamically generate elements for these effects when the theme is active, enhancing user experience.
2025-04-23 10:12:44 -07:00
DJObleezy
f8514eb35f Changed default wallet
Update wallet address and improve user flow

- Changed the `WALLET` environment variable in `docker-compose.yml` to a new Bitcoin address.
- Updated the wallet address in the "Use Defaults" button handler in `boot.html`.
- Modified the logic for user selection of 'N' to display the configuration form directly instead of redirecting to the dashboard.
- Updated fallback messages to reflect the new title "MINING CONTROL SYSTEM".
2025-04-23 10:01:53 -07:00
DJObleezy
f2ddcdd63a
Update README.md 2025-04-23 09:55:04 -07:00
DJObleezy
f1bf5d0582
Update README.md 2025-04-23 09:54:19 -07:00
DJObleezy
5770c96bf7
Update README.md 2025-04-23 08:46:03 -07:00
DJObleezy
a02837b28c
Update README.md 2025-04-23 08:12:38 -07:00
DJObleezy
41883f3f9c Enhance Bitcoin logo styling in DeepSea theme
Added fixed height and flexbox properties to center the Bitcoin logo. Adjusted positioning of the DeepSea ASCII art for perfect centering within the logo area.
2025-04-23 07:40:16 -07:00
DJObleezy
cb24f54685 Shorten version info in DeepSea theme
Updated the `#bitcoin-logo::before` pseudo-element to change the displayed version text from "DeepSea v.21" to "v.21", simplifying the version information.
2025-04-23 07:31:44 -07:00
DJObleezy
e02622d600 Enhance Bitcoin logo styling in DeepSea theme
- Added base styling for the Bitcoin logo with positioning and font settings.
- Updated logo styling to hide the original and accommodate new height.
- Introduced ASCII art for the Bitcoin logo with enhanced visual effects.
- Added "DeepSea" version info label for better branding visibility.
2025-04-23 07:26:40 -07:00
DJObleezy
a802880011 Enhance theme styling and mobile responsiveness
Updated `theme-toggle.css` for improved mobile styling of the theme toggle button, increasing padding and width. Reorganized CSS variables in `theme.js` for the DeepSea theme to enhance structure and readability. Adjusted color selectors for consistency across elements, including pool hashrate and navigation links, and modified button hover effects to utilize new primary color variables for a cohesive theme.
2025-04-23 06:58:56 -07:00
DJObleezy
df1678fac7 Increase font size for datum labels in dashboard.css
Updated the font size of the `.datum-label` class from `0.85em` to `0.95em` to enhance readability.
2025-04-22 22:38:03 -07:00
DJObleezy
6d3f873d6b Refactor text styling in dashboard.css
Removed padding and vertical alignment from text styles, and increased letter-spacing to 2px for improved text appearance.
2025-04-22 22:36:39 -07:00
DJObleezy
c7e2f0f4a9 Update .datum-label color to white
Changed the color of the `.datum-label` class from orange (`#ff9d00`) to white (`#ffffff`) for improved visibility.
2025-04-22 22:33:19 -07:00
DJObleezy
65d4deba5e Update theme toggle styles for better usability
Adjusted button border-radius for mobile view and added
a space in the content string for improved visual design.
2025-04-22 22:28:28 -07:00
DJObleezy
af3ea9607e
Update README.md 2025-04-22 21:47:53 -07:00
DJObleezy
eb95e6c6b5 Add theme toggle feature and configuration updates
- Introduced `theme-toggle.css` and `theme.js` to support a new theme toggle feature.
- Updated default configuration to include timezone and network fee.
- Enhanced command line arguments for network fee, timezone, and theme selection.
- Modified `create_config` to handle new configuration values from command line.
- Updated logging to reflect new network fee and timezone settings.
- Changed theme icons in `theme-toggle.css` for desktop and mobile views.
2025-04-22 20:39:45 -07:00
DJObleezy
f5e93f436b Implement DeepSea theme with CSS enhancements
Replaced the `applyDeepSeaTheme` function in `main.js` to apply a cohesive DeepSea theme, including extensive CSS variable definitions for UI elements.

In `theme.js`, updated styles to ensure visibility of pool hashrate text, enhanced button hover effects, and added direct DOM manipulation for consistent styling.

These changes improve the overall visual consistency and user experience of the application.
2025-04-22 20:10:10 -07:00
DJObleezy
e87993a252 Refactor theme toggle button positioning and styles
Updated the theme toggle button's positioning from `fixed` to `absolute` for consistency with `topRightLink`. Adjusted top and left positions, clarified media queries for desktop and mobile styles, and modified mobile button dimensions and padding. Reduced icon font size for better fit. These changes enhance the layout and responsiveness across different screen sizes.
2025-04-22 17:57:11 -07:00
DJObleezy
0d0a707019 Add responsive theme toggle and dynamic styling
Introduces a responsive theme toggle button with styles for desktop and mobile views in `theme-toggle.css`. Updates `BitcoinProgressBar.js` to support dynamic theme changes and adds a new `updateTheme` method. Enhances `main.js` for theme management based on user preferences in `localStorage`. Modifies `base.html` and other HTML files to include the theme toggle button and necessary scripts. Introduces `theme.js` for managing theme constants and applying the DeepSea theme.
2025-04-22 17:43:46 -07:00
DJObleezy
2142a7d2af Enhance generate_fallback_data method in WorkerService
Updated docstring to include argument and return type.
Added handling for None value in workers_count, defaulting to 1 worker.
Modified condition to use elif for clearer logic in worker count checks.
2025-04-22 14:06:11 -07:00
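A minimal sketch of the `None` handling described above, assuming a simple worker-dict shape; the `elif` count check follows the commit's description rather than the actual `WorkerService` code:

```python
def generate_fallback_data(workers_count=None):
    """Build placeholder worker entries when live data is unavailable.

    Args:
        workers_count: Expected number of workers, or None.

    Returns:
        A list of minimal worker dicts (fields are illustrative).
    """
    # None falls back to a single worker; so does a non-positive count.
    if workers_count is None:
        workers_count = 1
    elif workers_count < 1:
        workers_count = 1
    return [
        {"name": f"worker_{i + 1}", "status": "offline", "hashrate_60sec": 0}
        for i in range(workers_count)
    ]
```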
DJObleezy
231a56b18a Remove pool_luck metric and UI indicator
Updated the ArrowIndicator class to exclude the "pool_luck" metric while keeping "estimated_rewards_in_window_sats" and "workers_hashing". Removed the corresponding UI elements in the updateUI function, as the visual representation for "pool_luck" is no longer necessary.
2025-04-22 08:30:03 -07:00
DJObleezy
8c1c55c83f Remove duplicated saveConfig function from boot.html 2025-04-22 07:48:47 -07:00
DJObleezy
bdb9552576 Add network fee support to dashboard configuration
Updated the `reset_chart_data` function to include a new `network_fee` parameter in the `MiningDashboardService`. Modified `config.json` to add a default `network_fee` key. Enhanced `load_config` in `config.py` to handle the new parameter. Updated the `MiningDashboardService` constructor and `fetch_metrics` method to utilize the `network_fee` in calculations. Added a new input field for `network_fee` in `boot.html` and updated related JavaScript functions to manage this input. Improved the "Use Defaults" button functionality to reset the `network_fee` to its default value.
2025-04-22 07:43:57 -07:00
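Resolution of the new `network_fee` value might look like the following sketch; the precedence (environment variable over `config.json` over a 0.0 default) and the key names are assumptions:

```python
import os

def load_network_fee(config, env=os.environ):
    """Resolve the network fee percentage.

    Assumed precedence: NETWORK_FEE env var, then the config.json
    value, then a default of 0.0. Non-numeric values fall back to
    the default rather than raising.
    """
    raw = env.get("NETWORK_FEE", config.get("network_fee", 0.0))
    try:
        return float(raw)
    except (TypeError, ValueError):
        return 0.0
```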
DJObleezy
b8321fe3b0 Enhance pool fees percentage display logic
Updated conditional checks in `dashboard.html` to ensure
`metrics.pool_fees_percentage` is not `none` before display.
Added validation for percentage range (0.9 to 1.3) to show
star icon and "DATUM" label, improving data accuracy and
presentation.
2025-04-22 06:22:59 -07:00
DJObleezy
ce3b186dc5 Cleaned up Earning Efficiency, Time To Block, and Block Odds in the Pool Hashrates card. Also fixed a display error on the Last Block line in the Payout Info card. 2025-04-21 22:54:45 -07:00
DJObleezy
eab3e89a11 Update spacing for block odds in UI
Increased margin-left for the `probSpan` element in the
`updateUI` function from 10px to 17px for better spacing.
Removed the inline margin-left style from the `block_odds_3hr`
span in `dashboard.html`, promoting a more consistent style
management through JavaScript.
2025-04-21 10:17:33 -07:00
DJObleezy
7267244e94 Enhance block finding metrics and UI display
Updated CSS for improved styling, added functions to calculate block finding probability and time, and modified UI to display these metrics based on the 24-hour hashrate. New HTML elements added to the dashboard for better user visibility of block odds.
2025-04-21 10:04:54 -07:00
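The block-finding math described above can be sketched with the standard formulas: the chance of finding any given block is the pool's share of network hashrate, and expected time to a block scales the ~600-second block interval by that share. Function names here are hypothetical, not the dashboard's actual code:

```python
def block_probability(pool_hashrate, network_hashrate):
    """Per-block chance of finding the block (share of network hashrate)."""
    return pool_hashrate / network_hashrate

def expected_time_to_block(pool_hashrate, network_hashrate, block_interval=600):
    """Expected seconds until the next block, given a ~600s block interval."""
    return block_interval / block_probability(pool_hashrate, network_hashrate)

def odds_in_window(pool_hashrate, network_hashrate, window_seconds):
    """Probability of finding at least one block within the window."""
    p = block_probability(pool_hashrate, network_hashrate)
    blocks_in_window = window_seconds / 600  # ~one block every 10 minutes
    return 1 - (1 - p) ** blocks_in_window
```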
DJObleezy
3bb74c37e7 Remove acceptance rate tracking from mining services
This commit removes the `avg_acceptance_rate` and `acceptance_rate` fields from both the `MiningDashboardService` and `WorkerService` classes. The changes simplify the data structures and calculations related to worker data, as acceptance rates are no longer tracked or displayed in the mining dashboard. This includes the removal of default values and random generation of acceptance rates for workers.
2025-04-20 06:20:38 -07:00
DJObleezy
9a9f9ae178 Remove acceptance rate display from worker stats
Eliminated the acceptance rate section from the `createWorkerCard` function in `workers.js`, including its corresponding HTML in `workers.html`. Updated the `updateSummaryStats` function to remove references to the average acceptance rate.
2025-04-19 16:57:05 -07:00
DJObleezy
4f52697185 Update mempool URL comment and dashboard metrics
Clarified the use of "mempool.guide" in `blocks.js` to align with Ocean.xyz ethos.

In `dashboard.html`, replaced the "Pool Fees" section with "Blocks Found," including logic to display the number of blocks found, defaulting to "0" if not defined. Removed associated pool fees logic and updated indicators accordingly.
2025-04-19 09:35:38 -07:00
DJObleezy
f6b3fdb094 Update data source from mempool.space to mempool.guide
This commit updates all references from "mempool.space" to "mempool.guide" in multiple files, including README.md, project_structure.md, blocks.js, and blocks.html.
2025-04-19 06:26:02 -07:00
DJObleezy
f1eb0e22b9
Update README.md 2025-04-18 19:42:47 -07:00
DJObleezy
ee469866d7
Update README.md 2025-04-18 13:27:08 -07:00
DJObleezy
6d07060b7e Add API for resetting chart data and update frontend
Implemented a new API endpoint `/api/reset-chart-data` in `App.py` to clear chart data history and save state to Redis. Updated the `resetDashboardChart` function in `main.js` to make an AJAX call to this endpoint, providing immediate user feedback. Removed previous logic for handling latest metrics to streamline the reset process.
2025-04-18 12:27:20 -07:00
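A stdlib-only sketch of the reset logic behind the endpoint: the Redis write is represented by a plain callable, and in the real app this function would back the `/api/reset-chart-data` Flask route. Names and the response shape are assumptions:

```python
def reset_chart_data(history, save_state):
    """Clear chart history in place and persist the emptied state.

    Args:
        history: dict of metric name -> list of data points.
        save_state: callable that persists the state (stands in for
            the Redis write in the real service).
    """
    for series in history.values():
        series.clear()
    save_state({"arrow_history": history})
    return {"status": "success"}
```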
DJObleezy
f166126525 Add timezone support to last updated timestamp formatting
Updated the `updateLastUpdated()` function in `workers.js` to include timezone configuration for formatting the last updated timestamp. Introduced a `configuredTimezone` variable with a default value of 'America/Los_Angeles'. The timestamp is now formatted using this timezone, and a console log statement indicates the timezone used. Added a fallback to the current date and time in case of formatting errors.
2025-04-18 11:28:18 -07:00
DJObleezy
97fe19d61d Add configurable timezone support throughout the app
Updated the application to use a configurable timezone instead of hardcoding "America/Los_Angeles". This change impacts the dashboard, API endpoints, and worker services. Timezone is now fetched from a configuration file or environment variable, enhancing flexibility in time display. New API endpoints for available timezones and the current configured timezone have been added. The frontend now allows users to select their timezone from a dropdown menu, which is stored in local storage for future use. Timestamps in the UI have been updated to reflect the selected timezone.
2025-04-18 11:08:35 -07:00
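The resolution order for the configurable timezone might look like this sketch — environment variable, then config file, then the previously hardcoded default; the exact precedence is an assumption:

```python
import os

DEFAULT_TIMEZONE = "America/Los_Angeles"

def get_configured_timezone(config, env=os.environ):
    """Resolve the display timezone name.

    Assumed precedence: TIMEZONE environment variable, then the
    config file value, then the old hardcoded default. The returned
    name would be passed to zoneinfo.ZoneInfo() (Python 3.9+) when
    formatting timestamps for display.
    """
    return env.get("TIMEZONE") or config.get("timezone") or DEFAULT_TIMEZONE
```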
DJObleezy
96a71ec80d
Update README.md 2025-04-18 09:08:26 -07:00
DJObleezy
9a5c93036a Enhance .offline-dot style specificity
Added !important to the box-shadow property in the
.offline-dot class in common.css to ensure it takes
precedence over conflicting styles while keeping other
properties unchanged.
2025-04-17 19:43:28 -07:00
DJObleezy
eef2414ae2
Update README.md 2025-04-17 17:29:33 -07:00
DJObleezy
06f9d8a4b0
Update README.md 2025-04-17 17:19:18 -07:00
DJObleezy
c52001175b
Update README.md 2025-04-17 17:18:32 -07:00
DJObleezy
014b0acc24 Add fee indicator styling and conditional display
Implemented CSS styles for an optimal fee indicator, including a gold star and a label. Updated HTML to conditionally show the star and "DATUM" label when the pool fees percentage is between 0.9 and 1.3.
2025-04-17 15:27:30 -07:00
DJObleezy
982fe295d2 Add pool fees percentage metric to dashboard
Updated `MiningDashboardService` to calculate and display
the `pool_fees_percentage` metric, reflecting earnings lost
to pool fees. Enhanced error handling for earnings processing.
Updated styles in `dashboard.css` for the new metric and
added corresponding HTML elements in `dashboard.html` to
ensure proper display and conditional rendering.
2025-04-17 15:00:11 -07:00
DJObleezy
d98d496bd6 Limit worker data fetching to 10 pages
Updated the `get_all_worker_rows` method in the `MiningDashboardService` class to restrict the number of pages fetched to a maximum of 10. Enhanced logging to provide clearer information about the current page and maximum limit, and added a log message for when the maximum page limit is reached to improve visibility during data collection.
2025-04-17 10:05:51 -07:00
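The page cap described above reduces to a bounded pagination loop; this sketch abstracts the Ocean.xyz scrape behind a callable, so the function name matches the commit but the signature is an assumption:

```python
def get_all_worker_rows(fetch_page, max_pages=10):
    """Collect worker rows across paginated results, capped at max_pages.

    Args:
        fetch_page: callable taking a page number and returning a list
            of rows (empty once past the last page); stands in for the
            real per-page scrape.
    """
    rows = []
    for page in range(max_pages):
        page_rows = fetch_page(page)
        if not page_rows:
            break  # ran out of data before hitting the cap
        rows.extend(page_rows)
    else:
        # Loop finished without break: the cap was the limiting factor.
        print(f"Reached maximum page limit ({max_pages})")
    return rows
```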
DJObleezy
0bb90b1aca 2025-04-17 08:45:03 -07:00
DJObleezy
4a3e0c96da Revert "Enhance pool info display in latest block stats"
This reverts commit 1554b0b0c5.
2025-04-17 08:45:03 -07:00
DJObleezy
17446bddc1
Delete ocean_scraper.py 2025-04-17 08:43:38 -07:00
DJObleezy
1d84bb08a8
Add files via upload 2025-04-16 23:23:07 -07:00
DJObleezy
4e7aace5d8 Refactor data retrieval to use web scraping
This commit removes the `OceanAPIClient` and introduces the `OceanScraper` for data retrieval in the mining dashboard application. Key changes include:
- Updated `App.py` to import `OceanScraper`.
- Enhanced `data_service.py` to reflect the transition to web scraping, including updates to the `MiningDashboardService` class.
- Improved methods for fetching metrics and worker data with better error handling and logging.
- Preserved the original web scraping method as a fallback.
- Removed the `ocean_api_client.py` file.
- Added a new `ocean_scraper.py` file with comprehensive scraping functionality.
2025-04-16 22:05:12 -07:00
DJObleezy
60376e7395 Integrate Ocean API for enhanced metrics and worker data
Added `OceanAPIClient` to facilitate API interactions in `App.py`.
Modified `update_metrics_job` to include API status checks and fetch metrics.
Introduced `/api/check-api` endpoint for Ocean API health checks.
Updated `MiningDashboardService` to initialize the API client and fetch data directly from the Ocean API, with fallbacks to web scraping.
Refactored data retrieval methods to prioritize API calls and added error handling.
Enhanced logging for API interactions and created a new module `ocean_api_client.py` for encapsulating API logic.
Implemented retry mechanisms for API requests and updated data processing to align with the new API response structure.
2025-04-16 20:37:35 -07:00
DJObleezy
0a32b492b8 Improve API connectivity and error handling
Updated `MiningDashboardService` in `data_service.py` to enhance API connectivity testing and error handling. Introduced `_api_request_with_retry` for retry logic on API requests, and modified `_test_api_connectivity` to log detailed connectivity test information. Refactored multiple API calls to utilize the new retry method, improving reliability when fetching user hashrate data, pool stats, and other metrics.
2025-04-16 17:58:06 -07:00
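A generic stand-in for the `_api_request_with_retry` helper named above, using simple exponential backoff; the retry count, delay, and broad exception handling are assumptions:

```python
import time

def api_request_with_retry(request_fn, retries=3, base_delay=0.5):
    """Call request_fn, retrying failures with exponential backoff.

    Args:
        request_fn: zero-argument callable performing one API request
            (the real code would wrap an HTTP call).
        retries: total attempts before giving up.
        base_delay: initial sleep, doubled after each failed attempt.
    """
    last_error = None
    for attempt in range(retries):
        try:
            return request_fn()
        except Exception as exc:  # real code would catch request errors only
            last_error = exc
            if attempt < retries - 1:
                time.sleep(base_delay * (2 ** attempt))
    raise last_error
```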
DJObleezy
c9a2f927ff Improve API connectivity checks in MiningDashboardService
Updated the `_test_api_connectivity` method in `data_service.py` to include additional headers and enhanced logging. The method now attempts to ping a wallet-specific endpoint first, followed by a standard ping and a statsnap endpoint if necessary. Detailed error messages and a debug URL have been added for better troubleshooting, improving the overall robustness and clarity of the connectivity checks.
2025-04-16 17:46:47 -07:00
DJObleezy
f52b947633 Integrate Ocean.xyz API into MiningDashboardService
Updated `MiningDashboardService` in `data_service.py` to incorporate a new API from Ocean.xyz. Added base URL, connectivity test method, and new data-fetching methods. Existing methods modified to use the API when available, enhancing data retrieval efficiency. Improved error handling and logging, while retaining original web scraping methods as fallbacks.
2025-04-16 17:35:20 -07:00
DJObleezy
1554b0b0c5 Enhance pool info display in latest block stats
Updated the `updateLatestBlockStats` function to improve the presentation of pool information. Added color coding for pool names, highlighting "Ocean" pools with a star icon and special styling. The function now checks for pool name availability and adjusts the stats card styling accordingly, resetting to default for non-Ocean pools. Default display is set to "Unknown" if no pool information is available.
2025-04-15 21:53:11 -07:00
DJObleezy
d8f3972e03 Remove createBlockCard function from blocks.js
The `createBlockCard` function, which was responsible for generating block card elements displaying details such as timestamp, size, transaction count, miner/pool information, and average fee rate, has been completely removed. This change eliminates the functionality to display block cards in the codebase.
2025-04-15 21:49:44 -07:00
DJObleezy
9d5184e5c7 Add pool color mapping and enhance UI styling
Introduced a new `getPoolColor(poolName)` function to map mining pool names to specific colors, improving visual representation in the UI. Updated `createBlockCard(block)` and `showBlockDetails(block)` functions to utilize this new function, applying distinct styles for Ocean pools. These changes enhance user experience by providing clear, color-coded cues for different mining pools.
2025-04-15 21:44:45 -07:00
DJObleezy
5664e44cd9 Refactor daily stats posting logic
Updated the `_should_post_daily_stats` method to clarify that it checks for posting once per day at 12 PM. Simplified the logic to focus solely on this target time, requiring it to be a different day and within the first 5 minutes of 12 PM for posting. Adjusted the first-time posting condition to specifically check for 12 PM.
2025-04-15 21:31:19 -07:00
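The posting rule described above — a different day, at 12 PM, within the first five minutes — can be sketched as a pure predicate; the signature and parameter names are assumptions:

```python
from datetime import datetime

def should_post_daily_stats(now, last_posted_date, target_hour=12, window_minutes=5):
    """Check whether the once-daily stats post is due.

    Mirrors the described rule: post only at 12 PM, within the first
    five minutes of the hour, and only if nothing was posted today.
    last_posted_date is a date, or None if nothing was ever posted.
    """
    if now.hour != target_hour or now.minute >= window_minutes:
        return False
    return last_posted_date is None or last_posted_date != now.date()
```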
DJObleezy
1dec2ba35b Add explorer link to block details modal
This update introduces a new feature in `blocks.js` that adds an "Explorer Link" to view block details on the mempool.space website. The link is styled for visibility and includes an external link indicator. Additional comments were added for clarity, and minor adjustments were made to the existing code structure, ensuring the overall functionality of the block details modal remains intact.
2025-04-15 21:24:21 -07:00
DJObleezy
1eb17aed80 Refactor header styles and update last updated info
- Decreased font size of `h1` in `common.css` and added padding for better usability.
- Introduced hover effects for the top right link.
- Removed link from `h1` in `base.html` and added a block to display the last updated time.
2025-04-12 21:01:32 -07:00
DJObleezy
93175a7b40 Remove block mining animation SVG file
The entire SVG file for the block mining animation has been deleted. This includes all graphical elements, animations, and scripts related to the background, blockchain representation, Bitcoin logo, current block details, mining animation, status display, timestamp display, and CRT flicker animation.
2025-04-12 20:49:51 -07:00
DJObleezy
af89590b54
Update dashboard.html 2025-04-12 20:31:55 -07:00
DJObleezy
a2e253af93
Update main.js 2025-04-12 20:31:42 -07:00
DJObleezy
827f71651a
Update dashboard.css 2025-04-12 20:31:28 -07:00
DJObleezy
d9d239e06b Update dashboard return link in error page
Changed the return link from the root URL ("/") to the specific dashboard URL ("/dashboard") for better navigation.
2025-04-12 20:01:20 -07:00
DJObleezy
a4178d0a9b Removed Console 2025-04-12 19:59:57 -07:00
DJObleezy
695e4cfb95 Change unpaid earnings display to BTC format
Updated the `logCurrentStats` function to convert unpaid earnings from SATS to BTC. The new implementation divides the unpaid earnings by 100,000,000 and formats the result to 8 decimal places, improving clarity by using a more recognized cryptocurrency unit.
2025-04-12 17:37:20 -07:00
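The SATS-to-BTC conversion described above is a straight divide-and-format; the original lives in JavaScript (`logCurrentStats`), so this Python sketch with an assumed function name just illustrates the arithmetic:

```python
SATS_PER_BTC = 100_000_000

def format_unpaid_earnings(sats):
    """Format an unpaid-earnings figure given in SATS as a BTC string.

    Mirrors the described conversion: divide by 100,000,000 and keep
    eight decimal places.
    """
    return f"{sats / SATS_PER_BTC:.8f} BTC"
```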
DJObleezy
13294b6d72 Improve log message formatting in logCurrentStats
Updated the `logCurrentStats` function to enhance the display of daily profit and unpaid earnings. Daily profit now defaults to '0.00' instead of 'CALCULATING...', and unpaid earnings are parsed as integers. Additionally, power consumption now shows '0 WATTS' instead of 'N/A W' when not available.
2025-04-12 17:29:37 -07:00
DJObleezy
6c7e986a80 Refactor logging mechanism for periodic stats
Updated `processLogQueue` to log periodic stats when the queue is empty. Replaced the switch-case structure in `logCurrentStats` with an array of log messages, which are randomized and queued for display. Added `shuffleArray` helper function and removed specific logging for unpaid balances to streamline the process.
2025-04-12 16:04:44 -07:00
DJObleezy
7646149bb9 Refactor metric update system to log update system
Renamed `metricUpdateQueue` to `logUpdateQueue` and updated related functions to reflect this change. Introduced `logInterval` for processing updates every 2 seconds. Modified efficiency calculation in `logCurrentStats` to use `metrics.hashrate_60sec` for accuracy.
2025-04-12 15:57:57 -07:00
DJObleezy
3a9213d52d Update Bitcoin Mining Console for real-time metrics
- Clarified comments to emphasize real-time data display.
- Modified `consoleSettings` to include `refreshInterval`.
- Changed initialization message for the console.
- Enhanced `setupEventSource` with connection logging and error handling.
- Updated `fetchMetrics` to log connection and error messages.
- Introduced `processMetricChanges` to handle significant metric updates.
- Added `logCurrentStats` for periodic logging of mining statistics.
- Implemented `queueMetricUpdate` to manage console message display.
- Retained and adjusted `adjustConsoleLayout` for proper layout on load.
- Updated HTML title and copyright information for branding.
2025-04-12 15:50:24 -07:00
DJObleezy
79c80e2cec Update console styles and version information
- Added white text color to `.stat-value` in console.css for better visibility.
- Updated Bitcoin Mining Terminal version to `21.0000` in console.html.
- Changed copyright notice to "BTC OS - Mining Operations".
- Removed current time display elements from the console.
2025-04-12 15:41:47 -07:00
DJObleezy
60d716f429 Adjust console layout for improved responsiveness
Modified `console.css` to change body and container heights for better spacing. Added maximum height and margins to `.console-container`. Updated `.console-wrapper` to have a calculated height based on available space.

Introduced `adjustConsoleLayout` function in `console.js` to dynamically adjust wrapper height based on viewport size, ensuring consistent layout on load and resize.
2025-04-12 13:58:59 -07:00
DJObleezy
a60d21521d Improve console layout and responsiveness
Updated `console.css` to enhance the layout and responsiveness of the console interface. Adjusted the console container to utilize the full viewport height while maintaining a controlled height for the console wrapper. Changed console output positioning to relative for better spacing and positioned the stats bar at the bottom for consistency. Added a new JavaScript function `adjustConsoleLayout` to dynamically calculate and set the height of the console wrapper based on viewport size, improving user experience across different screen sizes.
2025-04-12 13:53:27 -07:00
DJObleezy
f718647966 Improve console layout and update branding
Updated CSS for better height management and responsiveness.
Adjusted `.console-container`, `.console-wrapper`, and `.console-output` for improved layout stability and overflow handling.
Set minimum heights for `.console-stats` and added padding for better spacing.
Changed title from "BITCOIN MINING TERMINAL v1.0" to "BTC OS LOG TERMINAL v1.0".
2025-04-12 13:44:30 -07:00
DJObleezy
898652b754 Refactor console.js for enhanced metrics reporting
Updated consoleSettings to reduce hashRateFluctuation from 10% to 1%. Improved generateLog function comments and significantly enhanced generateSystemMessage to include real-time metrics on power consumption, system health, processing capacity, and revenue projections. Added random messages for variety and introduced alerts for offline workers and negative profitability, improving the logging system's functionality and responsiveness.
2025-04-12 13:34:15 -07:00
DJObleezy
308abcce5f Disable auto margin in console.css
Removed the `margin: 0 auto;` line and replaced it with a commented-out version to keep it for reference while disabling the margin setting.
2025-04-12 13:28:24 -07:00
DJObleezy
6a63add833 Enhance console styling with retro CRT theme
Updated `console.css` to implement a retro CRT theme, featuring new background colors, gradient effects, and flicker animations. Improved styles for console elements and added neon-inspired color classes for messages. Included media queries for mobile responsiveness.

Moved CSS link in `console.html` to a new block for better organization and structure.
2025-04-12 13:17:31 -07:00
DJObleezy
545e5a9d92 Add console asset mappings in setup.py
This commit introduces new file mappings for `console.css`, `console.js`, and `console.html` in the `setup.py` file. These additions ensure that the console-related assets are included in the project.
2025-04-12 13:06:51 -07:00
DJObleezy
886e595ef4 Add Bitcoin mining console page and related assets
This commit introduces a new route in `App.py` for a retro-styled console log page that displays real-time Bitcoin mining metrics. It includes a new CSS file, `console.css`, for styling with effects like CRT and text glitch animations. The `console.js` file is added to handle log generation, metrics fetching, and real-time updates. Additionally, a new `console.html` file is created to structure the console page, integrating the necessary styles and scripts.
2025-04-12 13:02:51 -07:00
DJObleezy
c3de5544ef Enhance negative profit styling with text shadow
Added a `text-shadow` property to the styling of
`dailyProfitElement` and `monthlyProfitElement`
to improve visibility when profit values are negative.
Existing styles for color and font-weight remain unchanged.
2025-04-11 22:56:49 -07:00
DJObleezy
6f5b2ec359 Improve style handling in updateUI function
Updated the `updateUI` function to use `setAttribute` for applying styles with `!important` to `dailyProfitElement` and `monthlyProfitElement`. Changed the reset logic to remove the entire style attribute instead of setting it to an empty string, ensuring complete style clearance for positive profit values.
2025-04-11 22:52:45 -07:00
DJObleezy
08034ea9a7 Enhance profit color visibility in updateUI function
Updated the `style.color` property for negative profit
elements in the `updateUI` function to include the
`!important` flag. This change ensures that the red color
(`#ff5555`) takes precedence over conflicting styles,
improving the visibility of negative profit indicators
in the user interface.
2025-04-11 22:47:01 -07:00
DJObleezy
9681077fbd Enhance UI updates for revenue and notifications
Updated `updateUI` to display daily revenue and power cost with conditional formatting for negative profits. Removed previous profit updates. Introduced `updateNotificationBadge` function to fetch unread notifications count via AJAX.
2025-04-11 22:37:00 -07:00
DJObleezy
fc7cc6e0c5
Update setup.py 2025-04-10 20:09:36 -07:00
DJObleezy
6cb74188b1
Update BitcoinProgressBar.js 2025-04-10 16:52:07 -07:00
DJObleezy
05033f12ad
Update main.js 2025-04-10 16:40:57 -07:00
DJObleezy
d491220a4d
Update App.py 2025-04-10 07:17:10 -07:00
DJObleezy
8cc012219b
Update workers.html 2025-04-10 06:42:18 -07:00
DJObleezy
287b9fde6b
Update App.py 2025-04-09 21:20:07 -07:00
DJObleezy
a98c488eb6
Update main.js 2025-04-09 13:17:20 -07:00
DJObleezy
4123007c34
Update README.md 2025-04-09 11:57:42 -07:00
DJObleezy
2919de9dae
Update README.md 2025-04-09 11:57:06 -07:00
DJObleezy
a0690b5739
Update README.md 2025-04-09 11:56:03 -07:00
DJObleezy
30907f26ff
Update README.md 2025-04-09 11:54:33 -07:00
DJObleezy
895487bbd4
Update README.md 2025-04-09 11:53:45 -07:00
DJObleezy
8c2fc76e35
Update README.md 2025-04-09 11:53:04 -07:00
DJObleezy
cee6eec403
Update README.md 2025-04-09 11:52:18 -07:00
DJObleezy
3f867264ac
Update README.md 2025-04-09 11:48:22 -07:00
DJObleezy
28785b915e
Update README.md 2025-04-09 11:47:13 -07:00
DJObleezy
4be19833d7
Add files via upload 2025-04-09 11:46:24 -07:00
DJObleezy
f73a1825b6
Delete worker_service.py 2025-04-09 11:45:17 -07:00
DJObleezy
c8a62971e2
Delete state_manager.py 2025-04-09 11:45:12 -07:00
DJObleezy
b64c482c99
Delete setup.py 2025-04-09 11:45:06 -07:00
DJObleezy
461b541a7a
Delete requirements.txt 2025-04-09 11:45:00 -07:00
DJObleezy
cd5919d167
Delete project_structure.md 2025-04-09 11:44:54 -07:00
DJObleezy
605d7f15c0
Delete notification_service.py 2025-04-09 11:44:48 -07:00
DJObleezy
64e41140a6
Delete models.py 2025-04-09 11:44:42 -07:00
DJObleezy
7996d288bf
Delete dockerfile 2025-04-09 11:44:36 -07:00
DJObleezy
674801a2d2
Delete deployment_steps.md 2025-04-09 11:44:31 -07:00
DJObleezy
541ff9a73a
Delete data_service.py 2025-04-09 11:44:24 -07:00
DJObleezy
b11b64c38f
Delete config.py 2025-04-09 11:44:18 -07:00
DJObleezy
484a087250
Delete config.json 2025-04-09 11:44:11 -07:00
DJObleezy
026b86c255
Delete LICENSE.md 2025-04-09 11:44:05 -07:00
DJObleezy
2b7b3d66f8
Delete README.md 2025-04-09 11:43:58 -07:00
DJObleezy
cc2cd6354f
Delete App.py 2025-04-09 11:43:33 -07:00
DJObleezy
4845f42fa4
Delete templates directory 2025-04-09 11:43:14 -07:00
DJObleezy
d374bc3ba1
Delete static directory 2025-04-09 11:43:06 -07:00
DJObleezy
35dd182eb2
Update main.js 2025-04-04 14:20:44 -07:00
DJObleezy
c7e0af6431
Update main.js 2025-04-02 13:39:57 -07:00
DJObleezy
1416cf2bf5
Update notification_service.py 2025-04-02 13:39:33 -07:00
DJObleezy
353c9567bb
Update dashboard.css 2025-04-02 13:39:09 -07:00
DJObleezy
504eb88077
Update App.py 2025-04-02 13:37:05 -07:00
DJObleezy
1a99c93ec9
Update main.js 2025-04-01 06:00:47 -07:00
DJObleezy
13a38d351e
Update README.md 2025-03-30 21:42:50 -07:00
DJObleezy
315de329c2
Update workers.js 2025-03-30 21:11:16 -07:00
DJObleezy
c1d32f4cfc
Update workers.html 2025-03-30 21:03:04 -07:00
DJObleezy
e21b7be7a0
Update workers.js 2025-03-30 21:02:45 -07:00
DJObleezy
a2a661797d
Update README.md 2025-03-30 18:33:00 -07:00
DJObleezy
cfce1191cd
Update README.md 2025-03-30 18:30:37 -07:00
DJObleezy
11c8255a18
Update README.md 2025-03-30 15:57:36 -07:00
DJObleezy
c4d399748d
Update boot.html 2025-03-30 15:49:30 -07:00
DJObleezy
58cbc2a02c
Update main.js 2025-03-30 15:25:13 -07:00
DJObleezy
b15dfc8691
Update worker_service.py 2025-03-30 10:15:22 -07:00
DJObleezy
9988913861
Update App.py 2025-03-30 10:11:24 -07:00
DJObleezy
f812a41bca
Update App.py 2025-03-30 10:10:11 -07:00
DJObleezy
2023256113
Update README.md 2025-03-30 09:04:50 -07:00
DJObleezy
ff84f8879a
Update README.md 2025-03-30 09:04:19 -07:00
DJObleezy
4bbfcb70fe
Update README.md 2025-03-30 09:03:10 -07:00
DJObleezy
bbe2cced4d
Update README.md 2025-03-30 09:02:04 -07:00
DJObleezy
830ea8e917
Update main.js 2025-03-30 09:00:40 -07:00
DJObleezy
2cc3d16830
Update main.js 2025-03-30 06:21:44 -07:00
DJObleezy
1c4a2cfbdf
Update setup.py 2025-03-29 21:32:04 -07:00
DJObleezy
fb5dd76368
Update README.md 2025-03-29 21:31:15 -07:00
DJObleezy
e5fe4974f5
Update App.py 2025-03-29 21:28:10 -07:00
DJObleezy
8b8cfa5ff1
Update boot.html 2025-03-29 21:27:27 -07:00
DJObleezy
ac52bb8579
Update config.py 2025-03-29 21:25:58 -07:00
DJObleezy
2a08efdfdb
Update config.json 2025-03-29 21:25:20 -07:00
DJObleezy
04d80008c7
Update README.md 2025-03-29 21:23:36 -07:00
DJObleezy
a785aac643
Update blocks.html 2025-03-29 20:52:16 -07:00
DJObleezy
9ad41b5fff
Update workers.css 2025-03-29 20:51:51 -07:00
DJObleezy
eea4de4d57
Update blocks.css 2025-03-29 20:51:39 -07:00
DJObleezy
115e59b8ed
Update main.js 2025-03-29 20:33:03 -07:00
DJObleezy
fd412b9f22
Update BitcoinProgressBar.js 2025-03-29 20:32:40 -07:00
DJObleezy
8d40fb83a8
Update blocks.js 2025-03-29 20:32:18 -07:00
DJObleezy
495e272843
Update workers.js 2025-03-29 20:31:35 -07:00
DJObleezy
f4a4da4679
Update data_service.py 2025-03-29 15:29:52 -07:00
DJObleezy
6331ea2737
Update main.js 2025-03-28 19:57:46 -07:00
DJObleezy
49f696b136
Update notification_service.py 2025-03-28 19:42:19 -07:00
DJObleezy
8b59c0e784
Update models.py 2025-03-28 19:35:20 -07:00
DJObleezy
bfacc24734
Update data_service.py 2025-03-28 19:34:53 -07:00
DJObleezy
cc349dd0cb
Update workers.html 2025-03-28 19:34:23 -07:00
DJObleezy
fe7d19c43f
Update worker_service.py 2025-03-28 19:34:03 -07:00
DJObleezy
9d8355eb7b
Update workers.js 2025-03-28 19:33:40 -07:00
DJObleezy
8e6af4043a
Update worker_service.py 2025-03-28 19:25:59 -07:00
DJObleezy
33a96600a4
Update worker_service.py 2025-03-28 19:21:55 -07:00
DJObleezy
f89b7b2825
Update worker_service.py 2025-03-28 19:17:46 -07:00
DJObleezy
9656368478
Update BitcoinProgressBar.js 2025-03-28 19:14:02 -07:00
DJObleezy
7aee1fe982
Update block-animation.js 2025-03-28 19:13:46 -07:00
DJObleezy
401655c3ae
Update main.js 2025-03-28 19:13:21 -07:00
DJObleezy
7b4bd5344c
Update notifications.js 2025-03-28 19:13:04 -07:00
DJObleezy
514644a3ef
Update workers.js 2025-03-28 19:12:37 -07:00
DJObleezy
d4981d91e2
Update base.html 2025-03-28 19:12:17 -07:00
DJObleezy
c54b618880
Update blocks.html 2025-03-28 19:11:57 -07:00
DJObleezy
c18a85693c
Update notifications.html 2025-03-28 19:10:59 -07:00
DJObleezy
62703f1b4e
Update workers.html 2025-03-28 19:10:43 -07:00
DJObleezy
1a9b09afae
Update App.py 2025-03-28 19:09:59 -07:00
DJObleezy
b651320bfb
Update data_service.py 2025-03-28 19:09:07 -07:00
DJObleezy
dd3d94f2b7
Update models.py 2025-03-28 19:08:27 -07:00
DJObleezy
2fec01a990
Update setup.py 2025-03-28 19:08:06 -07:00
DJObleezy
98b29aa521
Update state_manager.py 2025-03-28 19:07:29 -07:00
DJObleezy
2b05ec885a
Update worker_service.py 2025-03-28 19:07:03 -07:00
DJObleezy
651ed80bbd
Update BitcoinProgressBar.js 2025-03-27 16:09:33 -07:00
DJObleezy
cbcaaec237
Update retro-refresh.css 2025-03-27 16:09:13 -07:00
DJObleezy
b4b6995cc1
Update state_manager.py 2025-03-27 09:51:42 -07:00
DJObleezy
fbdade3837
Update notifications.css 2025-03-27 09:51:01 -07:00
DJObleezy
b8368cd537
Update workers.js 2025-03-27 09:50:19 -07:00
DJObleezy
8e2f912616
Update notifications.js 2025-03-27 09:50:07 -07:00
DJObleezy
44ffdba522
Update main.js 2025-03-27 09:49:55 -07:00
DJObleezy
f57392f10e
Update blocks.js 2025-03-27 09:49:41 -07:00
DJObleezy
1e85ebb48d
Update BitcoinProgressBar.js 2025-03-27 09:48:39 -07:00
DJObleezy
58397c2c9b
Update main.js 2025-03-26 11:04:09 -07:00
DJObleezy
b9993a12f1
Update state_manager.py 2025-03-26 10:32:43 -07:00
DJObleezy
7b7f386a2d
Update main.js 2025-03-26 08:16:30 -07:00
DJObleezy
dd10921534
Update notification_service.py 2025-03-26 08:02:21 -07:00
DJObleezy
65ebf2032e
Update state_manager.py 2025-03-26 08:02:03 -07:00
DJObleezy
c92c22cf04
Update notification_service.py 2025-03-25 20:44:46 -07:00
DJObleezy
b42fb2d16f
Update blocks.html 2025-03-25 13:50:10 -07:00
DJObleezy
9c952135b8
Update base.html 2025-03-25 13:49:56 -07:00
DJObleezy
d22874e975
Update dashboard.html 2025-03-25 13:49:44 -07:00
DJObleezy
f00cbb6755
Add files via upload 2025-03-25 13:34:54 -07:00
DJObleezy
6fffc50af1
Add files via upload 2025-03-25 13:34:35 -07:00
DJObleezy
0342b16b40
Add files via upload 2025-03-25 13:34:10 -07:00
DJObleezy
3566f2127c
Add files via upload 2025-03-25 13:33:54 -07:00
DJObleezy
8dca2c41a4
Update common.css 2025-03-25 13:33:26 -07:00
DJObleezy
d368f37541
Update base.html 2025-03-25 13:32:59 -07:00
DJObleezy
03fa18e361
Update main.js 2025-03-25 13:32:30 -07:00
DJObleezy
40371b2f38
Update App.py 2025-03-25 13:31:46 -07:00
DJObleezy
26e1780501
Update state_manager.py 2025-03-25 13:30:17 -07:00
DJObleezy
44c3db8512
Update blocks.js 2025-03-25 12:24:11 -07:00
DJObleezy
37f48c0c10
Update blocks.html 2025-03-25 12:23:51 -07:00
DJObleezy
ae7defaab2
Update README.md 2025-03-25 11:51:43 -07:00
DJObleezy
c4bcc65a99
Update README.md 2025-03-25 11:49:50 -07:00
DJObleezy
d19f3ee11c
Update common.css 2025-03-25 08:37:44 -07:00
DJObleezy
5e25b5a485
Update blocks.css 2025-03-25 08:34:20 -07:00
DJObleezy
fb86839587
Delete static/js/retro-refresh.js 2025-03-25 08:20:31 -07:00
DJObleezy
f5a30f6561
Update workers.js 2025-03-25 08:19:45 -07:00
DJObleezy
9d96ca334f
Update main.js 2025-03-25 08:19:26 -07:00
DJObleezy
8c5c39d435
Add files via upload 2025-03-25 08:18:59 -07:00
DJObleezy
6fcbfa806d
Update workers.css 2025-03-25 08:18:29 -07:00
DJObleezy
e2c58c25cf
Update retro-refresh.css 2025-03-25 08:18:15 -07:00
DJObleezy
111135ded6
Update dashboard.css 2025-03-25 08:17:25 -07:00
DJObleezy
6c78264c2a
Update common.css 2025-03-25 08:17:10 -07:00
DJObleezy
a861df5891
Update boot.css 2025-03-25 08:16:52 -07:00
DJObleezy
4d224d1428
Add files via upload 2025-03-25 08:16:26 -07:00
DJObleezy
25c4c50ecb
Update workers.html 2025-03-25 08:15:58 -07:00
DJObleezy
403450da70
Update error.html 2025-03-25 08:15:38 -07:00
DJObleezy
218776cfbf
Update dashboard.html 2025-03-25 08:15:07 -07:00
DJObleezy
2ce3a40a65
Update boot.html 2025-03-25 08:14:46 -07:00
DJObleezy
bfb9840d24
Add files via upload 2025-03-25 08:14:22 -07:00
DJObleezy
4a1f11dbd0
Update base.html 2025-03-25 08:13:56 -07:00
DJObleezy
3fcd56784b
Update worker_service.py 2025-03-25 08:12:35 -07:00
DJObleezy
28c9e9592b
Update state_manager.py 2025-03-25 08:12:18 -07:00
DJObleezy
0cbbe8c6a8
Update setup.py 2025-03-25 08:11:38 -07:00
DJObleezy
f34dec00a5
Update requirements.txt 2025-03-25 08:05:04 -07:00
DJObleezy
41e2da5829
Update models.py 2025-03-25 08:04:40 -07:00
DJObleezy
7fb94e1c00
Update dockerfile 2025-03-25 08:04:20 -07:00
DJObleezy
c7fbcb086d
Update data_service.py 2025-03-25 08:03:41 -07:00
DJObleezy
3053375e33
Update config.py 2025-03-25 08:03:20 -07:00
DJObleezy
c11f490bd7
Update App.py 2025-03-25 08:02:22 -07:00
DJObleezy
505ba83ccf
Update deployment_steps.md 2025-03-25 07:44:53 -07:00
DJObleezy
545b780dd9
Update project_structure.md 2025-03-25 07:44:26 -07:00
DJObleezy
e101839535
Update README.md
v.3.1
2025-03-25 07:38:25 -07:00
DJObleezy
b4ad2ecbb0
Update common.css
Added basic colors back in.
2025-03-24 00:38:39 -07:00
DJObleezy
cee7d0e4c4
Update workers.html 2025-03-23 23:52:03 -07:00
DJObleezy
64850db9be
Update base.html 2025-03-23 23:51:20 -07:00
DJObleezy
4678f0234f
Update README.md 2025-03-23 23:47:54 -07:00
DJObleezy
b3a1500a61
Add files via upload 2025-03-23 23:46:09 -07:00
DJObleezy
85c50178b1
Add files via upload 2025-03-23 23:43:52 -07:00
DJObleezy
7f9d9890b7
Delete minify.py 2025-03-23 23:43:33 -07:00
DJObleezy
29ecc7732b
Add files via upload
v0.3 Update
2025-03-23 23:39:04 -07:00
DJObleezy
9236dcbdf9
Delete requirements.txt 2025-03-23 23:37:42 -07:00
DJObleezy
113c171f86
Delete dockerfile 2025-03-23 23:37:27 -07:00
DJObleezy
b3afd73509
Delete App.py 2025-03-23 23:37:19 -07:00
DJObleezy
77a8722bbe
Delete templates directory 2025-03-23 23:37:09 -07:00
DJObleezy
bd11f1285d
Delete static directory 2025-03-23 23:37:02 -07:00
DJObleezy
a97e4fd4c6
Update README.md
Updated for v0.3.
2025-03-23 23:36:42 -07:00
40 changed files with 15621 additions and 5895 deletions

2729
App.py

File diff suppressed because it is too large

405
README.md

@@ -1,152 +1,253 @@
# Ocean.xyz Bitcoin Mining Dashboard
## A Practical Monitoring Solution for Bitcoin Miners
This open-source dashboard provides comprehensive monitoring for Ocean.xyz pool miners, offering real-time data on hashrate, profitability, and worker status. Designed to be resource-efficient and user-friendly, it helps miners maintain oversight of their operations.
---
## Gallery:
![Boot Sequence](https://github.com/user-attachments/assets/8205e8c0-79ad-4780-bc50-237131373cf8)
![Main Dashboard](https://github.com/user-attachments/assets/33dafb93-38ef-4fee-aba1-3a7d38eca3c9)
![Workers Overview](https://github.com/user-attachments/assets/ae78c34c-fbdf-4186-9706-760a67eac44c)
---
## Practical Mining Intelligence
The dashboard aggregates essential metrics in one accessible interface:
- **Profitability Analysis**: Monitor daily and monthly earnings in BTC and USD
- **Worker Status**: Track online/offline status of mining equipment
- **Payout Monitoring**: View unpaid balance and estimated time to next payout
- **Network Metrics**: Stay informed of difficulty adjustments and network hashrate
- **Cost Analysis**: Calculate profit margins based on power consumption
## Key Features
### Mining Performance Metrics
- **Hashrate Visualization**: Clear graphical representation of hashrate trends
- **Financial Calculations**: Automatic conversion between BTC and USD values
- **Payout Estimation**: Projected time until minimum payout threshold is reached
- **Network Intelligence**: Current Bitcoin price, difficulty, and total network hashrate
### Worker Management
- **Equipment Overview**: Consolidated view of all mining devices
- **Status Monitoring**: Clear indicators for active and inactive devices
- **Performance Data**: Individual hashrate, temperature, and acceptance rate metrics
- **Filtering Options**: Sort and search by device type or operational status
### Thoughtful Design Elements
- **Retro Terminal Monitor**: A floating system monitor with classic design aesthetics
- **Boot Sequence**: An engaging initialization sequence on startup
- **Responsive Interface**: Adapts seamlessly to desktop and mobile devices
## Installation Options
### Standard Installation
1. Download the latest release package
2. Configure your mining parameters in `config.json`:
- Pool wallet address
- Electricity cost ($/kWh)
- System power consumption (watts)
3. Launch the application using the included startup script
4. Access the dashboard at `http://localhost:5000`
### Docker Installation
### Build and Run Manually
1. Clone the repository:
```bash
git clone https://github.com/djobleezy/ocean-mining-dashboard.git
cd ocean-mining-dashboard
```
2. Build the Docker image:
```bash
docker build -t mining-dashboard .
```
3. Run the container:
```bash
docker run -d -p 5000:5000 --name mining-dashboard mining-dashboard
```
4. Optional: Run with Redis for data persistence:
```bash
# First start a Redis container
docker run -d --name redis redis
# Then start the dashboard with Redis connection
docker run -d -p 5000:5000 --link redis --env REDIS_URL=redis://redis:6379 mining-dashboard
```
Then navigate to `http://localhost:5000` in your web browser.
## Dashboard Components
### Main Dashboard
- Interactive hashrate visualization
- Detailed profitability metrics
- Network statistics
- Current Bitcoin price
- Balance and payment information
### Workers Dashboard
![Fleet Summary](https://github.com/user-attachments/assets/3af7f79b-5679-41ae-94c7-b238934cb0b2)
- Fleet summary with aggregate statistics
- Individual worker performance metrics
- Status indicators for each device
- Flexible filtering and search functionality
### Retro Terminal Monitor
![System Monitor](https://github.com/user-attachments/assets/d5462b72-c4b2-4cef-bbc6-7f21c455e22e)
- Floating interface providing system statistics
- Progress indicator for data refresh cycles
- System uptime display
- Minimizable design for unobtrusive monitoring
- Thoughtful visual styling reminiscent of classic computer terminals
## System Requirements
The application is designed for efficient resource utilization:
- Compatible with standard desktop and laptop computers
- Modest CPU and memory requirements
- Suitable for continuous operation
- Cross-platform support for Windows, macOS, and Linux
## Troubleshooting
For optimal performance:
1. Use the refresh function if data appears outdated
2. Verify network connectivity for consistent updates
3. Restart the application after configuration changes
4. Access the health endpoint at `/api/health` for system status information
## Getting Started
1. Download the latest release
2. Configure with your mining information
3. Launch the application to begin monitoring
The dashboard requires only your Ocean.xyz mining wallet address for basic functionality.
---
## Technical Foundation
Built on Flask with Chart.js for visualization and Server-Sent Events for real-time updates, this dashboard retrieves data from Ocean.xyz and performs calculations based on current network metrics and your specified parameters.
The application prioritizes stability and efficiency for reliable long-term operation. Source code is available for review and customization.
## Acknowledgments
- Ocean.xyz mining pool for their service
- The open-source community for their contributions
- Bitcoin protocol developers
Available under the MIT License. This is an independent project not affiliated with Ocean.xyz.
# DeepSea Dashboard
## A Retro Mining Monitoring Solution
This open-source dashboard provides real-time monitoring for Ocean.xyz pool miners, offering detailed insights on hashrate, profitability, worker status, and network metrics. Designed with a retro terminal aesthetic and focused on reliability, it helps miners maintain complete oversight of their operations.
---
## Gallery:
![DeepSea Boot](https://github.com/user-attachments/assets/77222f13-1e95-48ee-a418-afd0e6b7a920)
![DeepSea Config](https://github.com/user-attachments/assets/48fcc2a6-f56e-48b9-ac61-b27e9b4a6e41)
![DeepSea Dashboard](https://github.com/user-attachments/assets/f8f3671e-907a-456a-b8c6-5d9ecd07946c)
---
## Key Features
### Real-Time Mining Metrics
- **Live Hashrate Tracking**: Monitor 60-second, 10-minute, 3-hour, and 24-hour average hashrates
- **Profitability Analysis**: View daily and monthly earnings in both BTC and USD
- **Financial Calculations**: Automatically calculate revenue, power costs, and net profit
- **Network Statistics**: Track current Bitcoin price, difficulty, and network hashrate
- **Payout Monitoring**: View unpaid balance and estimated time to next payout
- **Pool Fee Analysis**: Monitor pool fee percentages, with a visual indicator when optimal rates (0.9-1.3%) are detected
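The star-rating indicator for pool fees can be sketched as a small function. This is a minimal illustration: `pool_fee_stars` and the string rating are hypothetical names; only the 0.9-1.3% optimal band comes from the dashboard.

```python
def pool_fee_stars(fee_percent):
    """Map a pool fee percentage to a star rating.

    Sketch of the dashboard's indicator: fees inside the optimal
    0.9-1.3% band get the full three-star rating, anything else
    falls back to a single star, and a missing value shows nothing.
    """
    if fee_percent is None:
        return ""
    if 0.9 <= fee_percent <= 1.3:
        return "***"
    return "*"

print(pool_fee_stars(1.0))  # -> ***
```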
### Worker Management
- **Fleet Overview**: Comprehensive view of all mining devices in one interface
- **Status Monitoring**: Real-time status indicators for online and offline devices
- **Performance Data**: Individual hashrate, temperature, and acceptance rate metrics
- **Filtering Options**: Sort and search by device type or operational status
### Bitcoin Block Explorer
- **Recent Blocks**: View the latest blocks added to the blockchain
- **Block Details**: Examine transaction counts, fees, and mining pool information
- **Visual Indicators**: Track network difficulty and block discovery times
### System Resilience
- **Connection Recovery**: Automatic reconnection after network interruptions
- **Backup Polling**: Fallback to traditional polling if real-time connection fails
- **Cross-Tab Synchronization**: Data consistency across multiple browser tabs
- **Server Health Monitoring**: Built-in watchdog processes ensure reliability
- **Error Handling**: Displays a user-friendly error page (`error.html`) for unexpected issues.
### Distinctive Design Elements
- **Retro Terminal Aesthetic**: Nostalgic interface with modern functionality
- **Boot Sequence Animation**: Engaging initialization sequence on startup
- **System Monitor**: Floating status display with uptime and refresh information
- **Responsive Interface**: Adapts to desktop and mobile devices
### DeepSea Theme
- **Underwater Effects**: Light rays and digital noise create an immersive experience.
- **Retro Glitch Effects**: Subtle animations for a nostalgic feel.
- **Theme Toggle**: Switch between Bitcoin and DeepSea themes with a single click.
## Quick Start
### Installation
1. Clone the repository:
```
git clone https://github.com/Djobleezy/DeepSea-Dashboard.git
cd DeepSea-Dashboard
```
2. Install dependencies:
```
pip install -r requirements.txt
```
3. Run the setup script:
```
python setup.py
```
4. Start the application:
```
python App.py
```
5. Open your browser at `http://localhost:5000`
For detailed deployment instructions with Redis persistence and Gunicorn configuration, see [deployment_steps.md](deployment_steps.md).
## Using docker-compose (with Redis)
The `docker-compose.yml` file makes it easy to deploy the dashboard and its dependencies.
### Steps to Deploy
1. **Start the services**:
Run the following command in the project root:
```
docker-compose up -d
```
2. **Access the dashboard**:
Open your browser at `http://localhost:5000`.
3. **Stop the services**:
To stop the services, run:
```
docker-compose down
```
### Customization
You can modify the following environment variables in the `docker-compose.yml` file:
- `WALLET`: Your Bitcoin wallet address.
- `POWER_COST`: Cost of power per kWh.
- `POWER_USAGE`: Power usage in watts.
- `NETWORK_FEE`: Additional fees beyond pool fees (e.g., firmware fees).
- `TIMEZONE`: Local timezone for displaying time information.
Redis data is stored in a persistent volume (`redis_data`), and application logs are saved in the `./logs` directory.
For more details, refer to the [docker-compose documentation](https://docs.docker.com/compose/).
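The precedence between these variables and `config.json` follows the pattern `get_timezone` uses in `config.py`: the environment wins, then the config file, then a built-in default. A minimal sketch (the `effective_timezone` helper is illustrative, not part of the codebase):

```python
import os

def effective_timezone(config):
    # TIMEZONE set in docker-compose.yml overrides the config file,
    # which in turn overrides the Los Angeles default.
    return (os.environ.get("TIMEZONE")
            or config.get("timezone")
            or "America/Los_Angeles")

os.environ["TIMEZONE"] = "Europe/Berlin"
print(effective_timezone({"timezone": "America/New_York"}))  # -> Europe/Berlin
```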
## Dashboard Components
### Main Dashboard
- Interactive hashrate visualization with trend analysis
- Real-time profitability metrics with cost calculations
- Network statistics with difficulty and price tracking
- Payout information with estimation timing
- Visual indicators for metric changes
### Workers Dashboard
- Fleet summary with aggregate statistics
- Individual worker cards with detailed metrics
- Status indicators with color-coded alerts
- Search and filtering functionality
- Performance trend mini-charts
### Blocks Explorer
- Recent block visualization with mining details
- Transaction statistics and fee information
- Mining pool attribution
- Block details modal with comprehensive data
### System Monitor
- Floating interface providing system statistics
- Progress indicator for data refresh cycles
- System uptime display
- Real-time connection status
## System Requirements
The application is designed for efficient resource utilization:
- **Server**: Any system capable of running Python 3.9+
- **Memory**: Minimal requirements (~100MB RAM)
- **Storage**: Less than 50MB for application files
- **Database**: Optional Redis for persistent state
- **Compatible with**: Windows, macOS, and Linux
## Technical Architecture
Built with a modern stack for reliability and performance:
- **Backend**: Flask with Server-Sent Events for real-time updates
- **Frontend**: Vanilla JavaScript with Chart.js for visualization
- **Data Processing**: Concurrent API calls with smart caching
- **Resilience**: Automatic recovery mechanisms and state persistence
- **Configuration**: Environment variables and JSON-based settings
## API Endpoints
- `/api/metrics`: Provides real-time mining metrics.
- `/api/available_timezones`: Returns a list of supported timezones.
- `/api/config`: Fetches or updates the mining configuration.
- `/api/health`: Returns the health status of the application.
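A quick way to probe the health endpoint from a script — a sketch assuming the dashboard is running at the default `http://localhost:5000`; the payload schema is not documented here, so only reachability is checked:

```python
import json
import urllib.request

BASE_URL = "http://localhost:5000"  # default dashboard address

def fetch_health(base_url=BASE_URL, timeout=5):
    """Return the parsed JSON from /api/health, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/health", timeout=timeout) as resp:
            return json.loads(resp.read().decode())
    except OSError:
        return None

health = fetch_health()
print("dashboard up" if health else "dashboard unreachable")
```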
## Project Structure
The project follows a modular architecture with clear separation of concerns:
```
DeepSea-Dashboard/
├── App.py # Main application entry point
├── config.py # Configuration management
├── config.json # Configuration file
├── data_service.py # Service for fetching mining data
├── models.py # Data models
├── state_manager.py # Manager for persistent state
├── worker_service.py # Service for worker data management
├── notification_service.py # Service for notifications
├── minify.py # Script for minifying assets
├── setup.py # Setup script for organizing files
├── requirements.txt # Python dependencies
├── Dockerfile # Docker configuration
├── docker-compose.yml # Docker Compose configuration
├── templates/ # HTML templates
│ ├── base.html # Base template with common elements
│ ├── boot.html # Boot sequence animation
│ ├── dashboard.html # Main dashboard template
│ ├── workers.html # Workers dashboard template
│ ├── blocks.html # Bitcoin blocks template
│ ├── notifications.html # Notifications template
│ └── error.html # Error page template
├── static/ # Static assets
│ ├── css/ # CSS files
│ │ ├── common.css # Shared styles across all pages
│ │ ├── dashboard.css # Main dashboard styles
│ │ ├── workers.css # Workers page styles
│ │ ├── boot.css # Boot sequence styles
│ │ ├── blocks.css # Blocks page styles
│ │ ├── notifications.css # Notifications page styles
│ │ ├── error.css # Error page styles
│ │ ├── retro-refresh.css # Floating refresh bar styles
│ │ └── theme-toggle.css # Theme toggle styles
│ │
│ └── js/ # JavaScript files
│ ├── main.js # Main dashboard functionality
│ ├── workers.js # Workers page functionality
│ ├── blocks.js # Blocks page functionality
│ ├── notifications.js # Notifications functionality
│ ├── block-animation.js # Block mining animation
│ ├── BitcoinProgressBar.js # System monitor functionality
│ └── theme.js # Theme toggle functionality
├── deployment_steps.md # Deployment guide
├── project_structure.md # Additional structure documentation
├── LICENSE.md # License information
└── logs/ # Application logs (generated at runtime)
```
For more detailed information on the architecture and component interactions, see [project_structure.md](project_structure.md).
## Troubleshooting
For optimal performance:
1. Ensure your wallet address is correctly configured
2. Check network connectivity for consistent updates
3. Use the system monitor to verify connection status
4. Access the health endpoint at `/api/health` for diagnostics
5. For stale data issues, use the Force Refresh function
6. Use hotkey Shift+R to clear chart and Redis data (as needed, not required)
## License
Available under the MIT License. This is an independent project not affiliated with Ocean.xyz.
## Acknowledgments
- Ocean.xyz mining pool for their service
- mempool.guide
- The open-source community for their contributions
- Bitcoin protocol developers

7
config.json Normal file

@@ -0,0 +1,7 @@
{
"power_cost": 0.0,
"power_usage": 0.0,
"wallet": "yourwallethere",
"timezone": "America/Los_Angeles",
"network_fee": 0.0
}

96
config.py Normal file

@@ -0,0 +1,96 @@
"""
Configuration management module for the Bitcoin Mining Dashboard.
Responsible for loading and managing application settings.
"""
import os
import json
import logging

# Default configuration file path
CONFIG_FILE = "config.json"


def load_config():
    """
    Load configuration from file or return defaults if file doesn't exist.
    """
    default_config = {
        "power_cost": 0.0,
        "power_usage": 0.0,
        "wallet": "yourwallethere",
        "timezone": "America/Los_Angeles",
        "network_fee": 0.0  # Add default network fee
    }

    if os.path.exists(CONFIG_FILE):
        try:
            with open(CONFIG_FILE, "r") as f:
                config = json.load(f)
            logging.info(f"Configuration loaded from {CONFIG_FILE}")

            # Ensure network_fee is present even in existing config files
            if "network_fee" not in config:
                config["network_fee"] = default_config["network_fee"]
                logging.info("Added missing network_fee to config with default value")
            return config
        except Exception as e:
            logging.error(f"Error loading config: {e}")
    else:
        logging.warning(f"Config file {CONFIG_FILE} not found, using defaults")
    return default_config


def get_timezone():
    """
    Get the configured timezone with fallback to default.

    Returns:
        str: Timezone identifier
    """
    # First check environment variable (for Docker)
    env_timezone = os.environ.get("TIMEZONE")
    if env_timezone:
        return env_timezone

    # Then check config file
    config = load_config()
    timezone = config.get("timezone")
    if timezone:
        return timezone

    # Default to Los Angeles
    return "America/Los_Angeles"


def save_config(config):
    """
    Save configuration to file.

    Args:
        config (dict): Configuration dictionary to save

    Returns:
        bool: True if save was successful, False otherwise
    """
    try:
        with open(CONFIG_FILE, "w") as f:
            json.dump(config, f, indent=2)
        logging.info(f"Configuration saved to {CONFIG_FILE}")
        return True
    except Exception as e:
        logging.error(f"Error saving config: {e}")
        return False


def get_value(key, default=None):
    """
    Get a configuration value by key with fallback to default.

    Args:
        key (str): Configuration key to look up
        default: Default value if key is not found

    Returns:
        Value for the key or default if not found
    """
    config = load_config()
    return config.get(key, default)
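The `network_fee` backfill in `load_config` can be exercised in isolation; this standalone restatement of the same dictionary logic avoids touching `config.json` (`backfill_network_fee` is an illustrative name):

```python
DEFAULT_CONFIG = {
    "power_cost": 0.0,
    "power_usage": 0.0,
    "wallet": "yourwallethere",
    "timezone": "America/Los_Angeles",
    "network_fee": 0.0,
}

def backfill_network_fee(config, defaults=DEFAULT_CONFIG):
    # Mirrors load_config: older config.json files predate the
    # network_fee key, so it is filled in from the defaults.
    if "network_fee" not in config:
        config["network_fee"] = defaults["network_fee"]
    return config

legacy = {"wallet": "yourwallethere", "power_cost": 0.10}
print(backfill_network_fee(legacy)["network_fee"])  # -> 0.0
```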

993
data_service.py Normal file

@@ -0,0 +1,993 @@
"""
Data service module for fetching and processing mining data.
"""
import logging
import re
import time
import json
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
from concurrent.futures import ThreadPoolExecutor

import requests
from bs4 import BeautifulSoup

from models import OceanData, WorkerData, convert_to_ths
from config import get_timezone


class MiningDashboardService:
    """Service for fetching and processing mining dashboard data."""

    def __init__(self, power_cost, power_usage, wallet, network_fee=0.0):
        """
        Initialize the mining dashboard service.

        Args:
            power_cost (float): Cost of power in $ per kWh
            power_usage (float): Power usage in watts
            wallet (str): Bitcoin wallet address for Ocean.xyz
            network_fee (float): Additional network fee percentage
        """
        self.power_cost = power_cost
        self.power_usage = power_usage
        self.wallet = wallet
        self.network_fee = network_fee
        self.cache = {}
        self.sats_per_btc = 100_000_000
        self.previous_values = {}
        self.session = requests.Session()

    def fetch_metrics(self):
        """
        Fetch metrics from Ocean.xyz and other sources.

        Returns:
            dict: Mining metrics data
        """
        # Add execution time tracking
        start_time = time.time()

        try:
            with ThreadPoolExecutor(max_workers=2) as executor:
                future_ocean = executor.submit(self.get_ocean_data)
                future_btc = executor.submit(self.get_bitcoin_stats)
                try:
                    ocean_data = future_ocean.result(timeout=15)
                    btc_stats = future_btc.result(timeout=15)
                except Exception as e:
                    logging.error(f"Error fetching metrics concurrently: {e}")
                    return None

            if ocean_data is None:
                logging.error("Failed to retrieve Ocean data")
                return None

            difficulty, network_hashrate, btc_price, block_count = btc_stats

            # If we failed to get network hashrate, use a reasonable default to prevent division by zero
            if network_hashrate is None:
                logging.warning("Using default network hashrate")
                network_hashrate = 500e18  # ~500 EH/s as a reasonable fallback

            # If we failed to get BTC price, use a reasonable default
            if btc_price is None:
                logging.warning("Using default BTC price")
                btc_price = 75000  # $75,000 as a reasonable fallback

            # Convert hashrates to a common unit (TH/s) for consistency
            hr3 = ocean_data.hashrate_3hr or 0
            hr3_unit = (ocean_data.hashrate_3hr_unit or 'th/s').lower()
            local_hashrate = convert_to_ths(hr3, hr3_unit) * 1e12  # Convert to H/s for calculation

            hash_proportion = local_hashrate / network_hashrate if network_hashrate else 0
            block_reward = 3.125
            blocks_per_day = 86400 / 600
            daily_btc_gross = hash_proportion * block_reward * blocks_per_day

            # Use actual pool fees instead of hardcoded values
            # Get the pool fee percentage from ocean_data, default to 2.0% if not available
            pool_fee_percent = ocean_data.pool_fees_percentage if ocean_data.pool_fees_percentage is not None else 2.0

            # Get the network fee from the configuration (default to 0.0% if not set)
            from config import load_config
            config = load_config()
            network_fee_percent = config.get("network_fee", 0.0)

            # Calculate total fee percentage (converting from percentage to decimal)
total_fee_rate = (pool_fee_percent + network_fee_percent) / 100.0
# Calculate net BTC accounting for actual fees
daily_btc_net = daily_btc_gross * (1 - total_fee_rate)
# Log the fee calculations for transparency
logging.info(f"Earnings calculation using pool fee: {pool_fee_percent}% + network fee: {network_fee_percent}%")
logging.info(f"Total fee rate: {total_fee_rate}, Daily BTC gross: {daily_btc_gross}, Daily BTC net: {daily_btc_net}")
daily_revenue = round(daily_btc_net * btc_price, 2) if btc_price is not None else None
daily_power_cost = round((self.power_usage / 1000) * self.power_cost * 24, 2)
daily_profit_usd = round(daily_revenue - daily_power_cost, 2) if daily_revenue is not None else None
monthly_profit_usd = round(daily_profit_usd * 30, 2) if daily_profit_usd is not None else None
daily_mined_sats = int(round(daily_btc_net * self.sats_per_btc))
monthly_mined_sats = daily_mined_sats * 30
# Use default 0 for earnings if scraping returned None.
estimated_earnings_per_day = ocean_data.estimated_earnings_per_day if ocean_data.estimated_earnings_per_day is not None else 0
estimated_earnings_next_block = ocean_data.estimated_earnings_next_block if ocean_data.estimated_earnings_next_block is not None else 0
estimated_rewards_in_window = ocean_data.estimated_rewards_in_window if ocean_data.estimated_rewards_in_window is not None else 0
metrics = {
'pool_total_hashrate': ocean_data.pool_total_hashrate,
'pool_total_hashrate_unit': ocean_data.pool_total_hashrate_unit,
'hashrate_24hr': ocean_data.hashrate_24hr,
'hashrate_24hr_unit': ocean_data.hashrate_24hr_unit,
'hashrate_3hr': ocean_data.hashrate_3hr,
'hashrate_3hr_unit': ocean_data.hashrate_3hr_unit,
'hashrate_10min': ocean_data.hashrate_10min,
'hashrate_10min_unit': ocean_data.hashrate_10min_unit,
'hashrate_5min': ocean_data.hashrate_5min,
'hashrate_5min_unit': ocean_data.hashrate_5min_unit,
'hashrate_60sec': ocean_data.hashrate_60sec,
'hashrate_60sec_unit': ocean_data.hashrate_60sec_unit,
'workers_hashing': ocean_data.workers_hashing,
'btc_price': btc_price,
'block_number': block_count,
'network_hashrate': (network_hashrate / 1e18) if network_hashrate else None,
'difficulty': difficulty,
'daily_btc_gross': daily_btc_gross,
'daily_btc_net': daily_btc_net,
'pool_fee_percent': pool_fee_percent,
'network_fee_percent': network_fee_percent,
'total_fee_rate': total_fee_rate,
'estimated_earnings_per_day': estimated_earnings_per_day,
'daily_revenue': daily_revenue,
'daily_power_cost': daily_power_cost,
'daily_profit_usd': daily_profit_usd,
'monthly_profit_usd': monthly_profit_usd,
'daily_mined_sats': daily_mined_sats,
'monthly_mined_sats': monthly_mined_sats,
'estimated_earnings_next_block': estimated_earnings_next_block,
'estimated_rewards_in_window': estimated_rewards_in_window,
'unpaid_earnings': ocean_data.unpaid_earnings,
'est_time_to_payout': ocean_data.est_time_to_payout,
'last_block_height': ocean_data.last_block_height,
'last_block_time': ocean_data.last_block_time,
'total_last_share': ocean_data.total_last_share,
'blocks_found': ocean_data.blocks_found or "0",
'last_block_earnings': ocean_data.last_block_earnings,
'pool_fees_percentage': ocean_data.pool_fees_percentage,
}
metrics['estimated_earnings_per_day_sats'] = int(round(estimated_earnings_per_day * self.sats_per_btc))
metrics['estimated_earnings_next_block_sats'] = int(round(estimated_earnings_next_block * self.sats_per_btc))
metrics['estimated_rewards_in_window_sats'] = int(round(estimated_rewards_in_window * self.sats_per_btc))
            # --- Add server timestamps to the response, in the configured timezone ---
metrics["server_timestamp"] = datetime.now(ZoneInfo(get_timezone())).isoformat()
metrics["server_start_time"] = datetime.now(ZoneInfo(get_timezone())).isoformat()
# Log execution time
execution_time = time.time() - start_time
metrics["execution_time"] = execution_time
if execution_time > 10:
logging.warning(f"Metrics fetch took {execution_time:.2f} seconds")
else:
logging.info(f"Metrics fetch completed in {execution_time:.2f} seconds")
return metrics
except Exception as e:
logging.error(f"Unexpected error in fetch_metrics: {e}")
return None
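The profitability arithmetic in `fetch_metrics` (hash proportion times block reward times blocks per day, minus fees) can be checked with a small worked example. All figures here are illustrative placeholders, not live data:

```python
# Illustrative check of the earnings math used in fetch_metrics.
local_hashrate = 1e15          # 1 PH/s, expressed in H/s
network_hashrate = 500e18      # ~500 EH/s fallback value used above
block_reward = 3.125           # BTC per block, post-April-2024 halving
blocks_per_day = 86400 / 600   # ~144 blocks at a 10-minute target

hash_proportion = local_hashrate / network_hashrate
daily_btc_gross = hash_proportion * block_reward * blocks_per_day

pool_fee_percent = 2.0         # default assumed when scraping fails
network_fee_percent = 0.0
total_fee_rate = (pool_fee_percent + network_fee_percent) / 100.0
daily_btc_net = daily_btc_gross * (1 - total_fee_rate)

print(f"{daily_btc_gross:.8f} BTC gross, {daily_btc_net:.8f} BTC net")
```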
def get_ocean_data(self):
"""
Get mining data from Ocean.xyz.
Returns:
OceanData: Ocean.xyz mining data
"""
base_url = "https://ocean.xyz"
stats_url = f"{base_url}/stats/{self.wallet}"
headers = {
'User-Agent': 'Mozilla/5.0',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Cache-Control': 'no-cache'
}
# Create an empty data object to populate
data = OceanData()
try:
response = self.session.get(stats_url, headers=headers, timeout=10)
if not response.ok:
logging.error(f"Error fetching ocean data: status code {response.status_code}")
return None
soup = BeautifulSoup(response.text, 'html.parser')
# Safely extract pool status information
try:
pool_status = soup.find("p", id="pool-status-item")
if pool_status:
text = pool_status.get_text(strip=True)
m_total = re.search(r'HASHRATE:\s*([\d\.]+)\s*(\w+/s)', text, re.IGNORECASE)
if m_total:
raw_val = float(m_total.group(1))
unit = m_total.group(2)
data.pool_total_hashrate = raw_val
data.pool_total_hashrate_unit = unit
span = pool_status.find("span", class_="pool-status-newline")
if span:
last_block_text = span.get_text(strip=True)
m_block = re.search(r'LAST BLOCK:\s*(\d+\s*\(.*\))', last_block_text, re.IGNORECASE)
if m_block:
full_last_block = m_block.group(1)
data.last_block = full_last_block
match = re.match(r'(\d+)\s*\((.*?)\)', full_last_block)
if match:
data.last_block_height = match.group(1)
data.last_block_time = match.group(2)
else:
data.last_block_height = full_last_block
data.last_block_time = ""
except Exception as e:
logging.error(f"Error parsing pool status: {e}")
# Parse the earnings value from the earnings table and convert to sats.
try:
earnings_table = soup.find('tbody', id='earnings-tablerows')
if earnings_table:
latest_row = earnings_table.find('tr', class_='table-row')
if latest_row:
cells = latest_row.find_all('td', class_='table-cell')
if len(cells) >= 4: # Ensure there are enough cells for earnings and pool fees
earnings_text = cells[2].get_text(strip=True)
pool_fees_text = cells[3].get_text(strip=True)
# Parse earnings and pool fees
earnings_value = earnings_text.replace('BTC', '').strip()
pool_fees_value = pool_fees_text.replace('BTC', '').strip()
try:
# Convert earnings to BTC and sats
btc_earnings = float(earnings_value)
sats = int(round(btc_earnings * 100_000_000))
data.last_block_earnings = str(sats)
# Calculate percentage lost to pool fees
btc_pool_fees = float(pool_fees_value)
percentage_lost = (btc_pool_fees / btc_earnings) * 100 if btc_earnings > 0 else 0
data.pool_fees_percentage = round(percentage_lost, 2)
except Exception as e:
logging.error(f"Error converting earnings or calculating percentage: {e}")
data.last_block_earnings = earnings_value
data.pool_fees_percentage = None
except Exception as e:
logging.error(f"Error parsing earnings data: {e}")
# Parse hashrate data from the hashrates table
try:
time_mapping = {
'24 hrs': ('hashrate_24hr', 'hashrate_24hr_unit'),
'3 hrs': ('hashrate_3hr', 'hashrate_3hr_unit'),
'10 min': ('hashrate_10min', 'hashrate_10min_unit'),
'5 min': ('hashrate_5min', 'hashrate_5min_unit'),
'60 sec': ('hashrate_60sec', 'hashrate_60sec_unit')
}
hashrate_table = soup.find('tbody', id='hashrates-tablerows')
if hashrate_table:
for row in hashrate_table.find_all('tr', class_='table-row'):
cells = row.find_all('td', class_='table-cell')
if len(cells) >= 2:
period_text = cells[0].get_text(strip=True).lower()
hashrate_str = cells[1].get_text(strip=True).lower()
try:
parts = hashrate_str.split()
hashrate_val = float(parts[0])
unit = parts[1] if len(parts) > 1 else 'th/s'
for key, (attr, unit_attr) in time_mapping.items():
if key.lower() in period_text:
setattr(data, attr, hashrate_val)
setattr(data, unit_attr, unit)
break
except Exception as e:
logging.error(f"Error parsing hashrate '{hashrate_str}': {e}")
except Exception as e:
logging.error(f"Error parsing hashrate table: {e}")
# Parse lifetime stats data
try:
lifetime_snap = soup.find('div', id='lifetimesnap-statcards')
if lifetime_snap:
for container in lifetime_snap.find_all('div', class_='blocks dashboard-container'):
label_div = container.find('div', class_='blocks-label')
if label_div:
label_text = label_div.get_text(strip=True).lower()
earnings_span = label_div.find_next('span', class_=lambda x: x != 'tooltiptext')
if earnings_span:
span_text = earnings_span.get_text(strip=True)
try:
earnings_value = float(span_text.split()[0].replace(',', ''))
if "earnings" in label_text and "day" in label_text:
data.estimated_earnings_per_day = earnings_value
except Exception:
pass
except Exception as e:
logging.error(f"Error parsing lifetime stats: {e}")
# Parse payout stats data
try:
payout_snap = soup.find('div', id='payoutsnap-statcards')
if payout_snap:
for container in payout_snap.find_all('div', class_='blocks dashboard-container'):
label_div = container.find('div', class_='blocks-label')
if label_div:
label_text = label_div.get_text(strip=True).lower()
earnings_span = label_div.find_next('span', class_=lambda x: x != 'tooltiptext')
if earnings_span:
span_text = earnings_span.get_text(strip=True)
try:
earnings_value = float(span_text.split()[0].replace(',', ''))
if "earnings" in label_text and "block" in label_text:
data.estimated_earnings_next_block = earnings_value
elif "rewards" in label_text and "window" in label_text:
data.estimated_rewards_in_window = earnings_value
except Exception:
pass
except Exception as e:
logging.error(f"Error parsing payout stats: {e}")
# Parse user stats data
try:
usersnap = soup.find('div', id='usersnap-statcards')
if usersnap:
for container in usersnap.find_all('div', class_='blocks dashboard-container'):
label_div = container.find('div', class_='blocks-label')
if label_div:
label_text = label_div.get_text(strip=True).lower()
value_span = label_div.find_next('span', class_=lambda x: x != 'tooltiptext')
if value_span:
span_text = value_span.get_text(strip=True)
if "workers currently hashing" in label_text:
try:
data.workers_hashing = int(span_text.replace(",", ""))
except Exception:
pass
elif "unpaid earnings" in label_text and "btc" in span_text.lower():
try:
data.unpaid_earnings = float(span_text.split()[0].replace(',', ''))
except Exception:
pass
elif "estimated time until minimum payout" in label_text:
data.est_time_to_payout = span_text
except Exception as e:
logging.error(f"Error parsing user stats: {e}")
# Parse blocks found data
try:
blocks_container = soup.find(lambda tag: tag.name == "div" and "blocks found" in tag.get_text(strip=True).lower())
if blocks_container:
span = blocks_container.find_next_sibling("span")
if span:
num_match = re.search(r'(\d+)', span.get_text(strip=True))
if num_match:
data.blocks_found = num_match.group(1)
except Exception as e:
logging.error(f"Error parsing blocks found: {e}")
# Parse last share time data
try:
workers_table = soup.find("tbody", id="workers-tablerows")
if workers_table:
for row in workers_table.find_all("tr", class_="table-row"):
cells = row.find_all("td")
if cells and cells[0].get_text(strip=True).lower().startswith("total"):
last_share_str = cells[2].get_text(strip=True)
try:
naive_dt = datetime.strptime(last_share_str, "%Y-%m-%d %H:%M")
utc_dt = naive_dt.replace(tzinfo=ZoneInfo("UTC"))
la_dt = utc_dt.astimezone(ZoneInfo(get_timezone()))
data.total_last_share = la_dt.strftime("%Y-%m-%d %I:%M %p")
except Exception as e:
logging.error(f"Error converting last share time '{last_share_str}': {e}")
data.total_last_share = last_share_str
break
except Exception as e:
logging.error(f"Error parsing last share time: {e}")
return data
except Exception as e:
logging.error(f"Error fetching Ocean data: {e}")
return None
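The pool-status regex used above can be exercised in isolation. The sample text below is hypothetical, shaped like Ocean.xyz's `pool-status-item` line rather than copied from it:

```python
import re

# Sketch of the HASHRATE regex from get_ocean_data on a made-up status line.
text = "HASHRATE: 12.5 PH/s LAST BLOCK: 842001 (2 hours ago)"
m = re.search(r'HASHRATE:\s*([\d\.]+)\s*(\w+/s)', text, re.IGNORECASE)
value, unit = (float(m.group(1)), m.group(2)) if m else (None, None)
print(value, unit)  # -> 12.5 PH/s
```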
def debug_dump_table(self, table_element, max_rows=3):
"""
Helper method to dump the structure of an HTML table for debugging.
Args:
table_element: BeautifulSoup element representing the table
max_rows (int): Maximum number of rows to output
"""
if not table_element:
logging.debug("Table element is None - cannot dump structure")
return
try:
rows = table_element.find_all('tr', class_='table-row')
logging.debug(f"Found {len(rows)} rows in table")
# Dump header row if present
header_row = table_element.find_parent('table').find('thead')
if header_row:
header_cells = header_row.find_all('th')
header_texts = [cell.get_text(strip=True) for cell in header_cells]
logging.debug(f"Header: {header_texts}")
# Dump a sample of the data rows
for i, row in enumerate(rows[:max_rows]):
cells = row.find_all('td', class_='table-cell')
cell_texts = [cell.get_text(strip=True) for cell in cells]
logging.debug(f"Row {i}: {cell_texts}")
# Also look at raw HTML for problematic cells
for j, cell in enumerate(cells):
logging.debug(f"Row {i}, Cell {j} HTML: {cell}")
except Exception as e:
logging.error(f"Error dumping table structure: {e}")
def fetch_url(self, url: str, timeout: int = 5):
"""
Fetch URL with error handling.
Args:
url (str): URL to fetch
timeout (int): Timeout in seconds
Returns:
Response: Request response or None if failed
"""
try:
return self.session.get(url, timeout=timeout)
except Exception as e:
logging.error(f"Error fetching {url}: {e}")
return None
def get_bitcoin_stats(self):
"""
Fetch Bitcoin network statistics with improved error handling and caching.
Returns:
tuple: (difficulty, network_hashrate, btc_price, block_count)
"""
urls = {
"difficulty": "https://blockchain.info/q/getdifficulty",
"hashrate": "https://blockchain.info/q/hashrate",
"ticker": "https://blockchain.info/ticker",
"blockcount": "https://blockchain.info/q/getblockcount"
}
# Use previous cached values as defaults if available
difficulty = self.cache.get("difficulty")
network_hashrate = self.cache.get("network_hashrate")
btc_price = self.cache.get("btc_price")
block_count = self.cache.get("block_count")
try:
with ThreadPoolExecutor(max_workers=4) as executor:
futures = {key: executor.submit(self.fetch_url, url) for key, url in urls.items()}
responses = {key: futures[key].result(timeout=5) for key in futures}
# Process each response individually with error handling
if responses["difficulty"] and responses["difficulty"].ok:
try:
difficulty = float(responses["difficulty"].text)
self.cache["difficulty"] = difficulty
except (ValueError, TypeError) as e:
logging.error(f"Error parsing difficulty: {e}")
if responses["hashrate"] and responses["hashrate"].ok:
try:
network_hashrate = float(responses["hashrate"].text) * 1e9
self.cache["network_hashrate"] = network_hashrate
except (ValueError, TypeError) as e:
logging.error(f"Error parsing network hashrate: {e}")
if responses["ticker"] and responses["ticker"].ok:
try:
ticker_data = responses["ticker"].json()
btc_price = float(ticker_data.get("USD", {}).get("last", btc_price))
self.cache["btc_price"] = btc_price
except (ValueError, TypeError, json.JSONDecodeError) as e:
logging.error(f"Error parsing BTC price: {e}")
if responses["blockcount"] and responses["blockcount"].ok:
try:
block_count = int(responses["blockcount"].text)
self.cache["block_count"] = block_count
except (ValueError, TypeError) as e:
logging.error(f"Error parsing block count: {e}")
except Exception as e:
logging.error(f"Error fetching Bitcoin stats: {e}")
return difficulty, network_hashrate, btc_price, block_count
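The caching strategy in `get_bitcoin_stats` — seed results with the last good values, overwrite only on a successful fetch — can be distilled into a generic helper. `fetch_with_cache` is a sketch of that pattern, not a function in this codebase:

```python
def fetch_with_cache(cache, key, fetcher):
    """Cache-as-fallback: keep the last good value and fall back to it
    when a fresh fetch raises or returns None."""
    try:
        value = fetcher()
        if value is not None:
            cache[key] = value
            return value
    except Exception:
        pass
    return cache.get(key)  # stale-but-usable, or None if never fetched

def failing_fetch():
    raise RuntimeError("endpoint down")

cache = {}
fetch_with_cache(cache, "btc_price", lambda: 75000.0)   # fresh fetch succeeds
print(fetch_with_cache(cache, "btc_price", failing_fetch))  # -> 75000.0
```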
def get_all_worker_rows(self):
"""
Iterate through wpage parameter values to collect all worker table rows.
Limited to 10 pages to balance between showing enough workers and maintaining performance.
Returns:
list: A list of BeautifulSoup row elements containing worker data.
"""
all_rows = []
page_num = 0
max_pages = 10 # Limit to 10 pages of worker data
while page_num < max_pages: # Only fetch up to max_pages
url = f"https://ocean.xyz/stats/{self.wallet}?wpage={page_num}#workers-fulltable"
logging.info(f"Fetching worker data from: {url} (page {page_num+1} of max {max_pages})")
response = self.session.get(url, timeout=15)
if not response.ok:
logging.error(f"Error fetching page {page_num}: status code {response.status_code}")
break
soup = BeautifulSoup(response.text, 'html.parser')
workers_table = soup.find('tbody', id='workers-tablerows')
if not workers_table:
logging.debug(f"No workers table found on page {page_num}")
break
rows = workers_table.find_all("tr", class_="table-row")
if not rows:
logging.debug(f"No worker rows found on page {page_num}, stopping pagination")
break
logging.info(f"Found {len(rows)} worker rows on page {page_num}")
all_rows.extend(rows)
page_num += 1
if page_num >= max_pages:
logging.info(f"Reached maximum page limit ({max_pages}). Collected {len(all_rows)} worker rows total.")
else:
logging.info(f"Completed fetching all available worker data. Collected {len(all_rows)} worker rows from {page_num} pages.")
return all_rows
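The pagination loop above reduces to a simple bounded-collection pattern: stop at the first empty page or at the page cap, whichever comes first. A minimal sketch with hypothetical page contents:

```python
def collect_pages(fetch_page, max_pages=10):
    """Bounded pagination as in get_all_worker_rows: break on an empty
    page, and never request more than max_pages pages."""
    rows = []
    for page in range(max_pages):
        batch = fetch_page(page)
        if not batch:
            break
        rows.extend(batch)
    return rows

pages = {0: ["w1", "w2"], 1: ["w3"]}  # hypothetical per-page worker rows
print(collect_pages(lambda p: pages.get(p, [])))  # -> ['w1', 'w2', 'w3']
```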
def get_worker_data(self):
"""
Get worker data from Ocean.xyz using multiple parsing strategies.
Tries different approaches to handle changes in the website structure.
Validates worker names to ensure they're not status indicators.
Returns:
dict: Worker data dictionary with stats and list of workers
"""
logging.info("Attempting to get worker data from Ocean.xyz")
# First try the alternative method as it's more robust
result = self.get_worker_data_alternative()
# Check if alternative method succeeded and found workers with valid names
if result and result.get('workers') and len(result['workers']) > 0:
# Validate workers - check for invalid names
has_valid_workers = False
for worker in result['workers']:
name = worker.get('name', '').lower()
if name and name not in ['online', 'offline', 'total', 'worker', 'status']:
has_valid_workers = True
break
if has_valid_workers:
logging.info(f"Alternative worker data method successful: {len(result['workers'])} workers with valid names")
return result
else:
logging.warning("Alternative method found workers but with invalid names")
# If alternative method failed or found workers with invalid names, try the original method
logging.info("Trying original worker data method")
result = self.get_worker_data_original()
# Check if original method succeeded and found workers with valid names
if result and result.get('workers') and len(result['workers']) > 0:
# Validate workers - check for invalid names
has_valid_workers = False
for worker in result['workers']:
name = worker.get('name', '').lower()
if name and name not in ['online', 'offline', 'total', 'worker', 'status']:
has_valid_workers = True
break
if has_valid_workers:
logging.info(f"Original worker data method successful: {len(result['workers'])} workers with valid names")
return result
else:
logging.warning("Original method found workers but with invalid names")
# If both methods failed or found workers with invalid names, use fallback data
logging.warning("Both worker data fetch methods failed to get valid names, using fallback data")
# Try to get worker count from cached metrics
workers_count = 0
if hasattr(self, 'cached_metrics') and self.cached_metrics:
workers_count = self.cached_metrics.get('workers_hashing', 0)
# If no cached metrics, try to get from somewhere else
if workers_count <= 0 and result and result.get('workers_total'):
workers_count = result.get('workers_total')
# Ensure we have at least 1 worker
workers_count = max(1, workers_count)
        # No fallback generator is wired in here, so surface the failure to the caller
        logging.info(f"Fallback would assume {workers_count} workers, but no generator is configured; returning None")
        return None
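Both fetch strategies apply the same sanity check before trusting their results: at least one worker must carry a real name rather than a status or header label. That check can be factored out as a sketch (`has_valid_worker` is an illustrative name, not a method of this class):

```python
INVALID_NAMES = {"online", "offline", "total", "worker", "status"}

def has_valid_worker(workers):
    """True if any worker dict carries a real name, mirroring the
    validation applied to both fetch strategies above."""
    return any(
        w.get("name") and w["name"].lower() not in INVALID_NAMES
        for w in workers
    )

print(has_valid_worker([{"name": "Online"}]))      # -> False
print(has_valid_worker([{"name": "antminer01"}]))  # -> True
```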
    # Original implementation, retained as the secondary fetch strategy
def get_worker_data_original(self):
"""
Original implementation to get worker data from Ocean.xyz.
Returns:
dict: Worker data dictionary with stats and list of workers
"""
base_url = "https://ocean.xyz"
stats_url = f"{base_url}/stats/{self.wallet}"
headers = {
'User-Agent': 'Mozilla/5.0',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Cache-Control': 'no-cache'
}
try:
logging.info(f"Fetching worker data from {stats_url}")
response = self.session.get(stats_url, headers=headers, timeout=15)
if not response.ok:
logging.error(f"Error fetching ocean worker data: status code {response.status_code}")
return None
soup = BeautifulSoup(response.text, 'html.parser')
# Parse worker data from the workers table
workers = []
total_hashrate = 0
total_earnings = 0
workers_table = soup.find('tbody', id='workers-tablerows')
if not workers_table:
logging.error("Workers table not found in Ocean.xyz page")
return None
# Debug: Dump table structure to help diagnose parsing issues
self.debug_dump_table(workers_table)
# Find total worker counts
workers_online = 0
workers_offline = 0
# Iterate through worker rows in the table
for row in workers_table.find_all('tr', class_='table-row'):
cells = row.find_all('td', class_='table-cell')
# Skip rows that don't have enough cells for basic info
if len(cells) < 3:
logging.warning(f"Worker row has too few cells: {len(cells)}")
continue
try:
# Extract worker name from the first cell
name_cell = cells[0]
name_text = name_cell.get_text(strip=True)
# Skip the total row
if name_text.lower() == 'total':
logging.debug("Skipping total row")
continue
logging.debug(f"Processing worker: {name_text}")
# Create worker object with safer extraction
worker = {
"name": name_text.strip(),
"status": "offline", # Default to offline
"type": "ASIC", # Default type
"model": "Unknown",
"hashrate_60sec": 0,
"hashrate_60sec_unit": "TH/s",
"hashrate_3hr": 0,
"hashrate_3hr_unit": "TH/s",
"efficiency": 90.0, # Default efficiency
"last_share": "N/A",
"earnings": 0,
"power_consumption": 0,
"temperature": 0
}
# Parse status from second cell if available
if len(cells) > 1:
status_cell = cells[1]
status_text = status_cell.get_text(strip=True).lower()
worker["status"] = "online" if "online" in status_text else "offline"
# Update counter based on status
if worker["status"] == "online":
workers_online += 1
else:
workers_offline += 1
# Parse last share time
if len(cells) > 2:
last_share_cell = cells[2]
worker["last_share"] = last_share_cell.get_text(strip=True)
# Parse 60sec hashrate if available
if len(cells) > 3:
hashrate_60s_cell = cells[3]
hashrate_60s_text = hashrate_60s_cell.get_text(strip=True)
# Parse hashrate_60sec and unit with more robust handling
try:
                            parts = hashrate_60s_text.split()
                            if parts:
# First part should be the number
try:
numeric_value = float(parts[0])
worker["hashrate_60sec"] = numeric_value
# Second part should be the unit if it exists
if len(parts) > 1 and 'btc' not in parts[1].lower():
worker["hashrate_60sec_unit"] = parts[1]
except ValueError:
# If we can't convert to float, it might be a non-numeric value
logging.warning(f"Could not parse 60s hashrate value: {parts[0]}")
except Exception as e:
logging.error(f"Error parsing 60s hashrate '{hashrate_60s_text}': {e}")
# Parse 3hr hashrate if available
if len(cells) > 4:
hashrate_3hr_cell = cells[4]
hashrate_3hr_text = hashrate_3hr_cell.get_text(strip=True)
# Parse hashrate_3hr and unit with more robust handling
try:
                            parts = hashrate_3hr_text.split()
                            if parts:
# First part should be the number
try:
numeric_value = float(parts[0])
worker["hashrate_3hr"] = numeric_value
# Second part should be the unit if it exists
if len(parts) > 1 and 'btc' not in parts[1].lower():
worker["hashrate_3hr_unit"] = parts[1]
# Add to total hashrate (normalized to TH/s for consistency)
total_hashrate += convert_to_ths(worker["hashrate_3hr"], worker["hashrate_3hr_unit"])
except ValueError:
# If we can't convert to float, it might be a non-numeric value
logging.warning(f"Could not parse 3hr hashrate value: {parts[0]}")
except Exception as e:
logging.error(f"Error parsing 3hr hashrate '{hashrate_3hr_text}': {e}")
# Parse earnings if available
if len(cells) > 5:
earnings_cell = cells[5]
earnings_text = earnings_cell.get_text(strip=True)
# Parse earnings with more robust handling
try:
# Remove BTC or other text, keep only the number
earnings_value = earnings_text.replace('BTC', '').strip()
try:
worker["earnings"] = float(earnings_value)
total_earnings += worker["earnings"]
except ValueError:
logging.warning(f"Could not parse earnings value: {earnings_value}")
except Exception as e:
logging.error(f"Error parsing earnings '{earnings_text}': {e}")
# Set worker type based on name (if it can be inferred)
lower_name = worker["name"].lower()
if 'antminer' in lower_name:
worker["type"] = 'ASIC'
worker["model"] = 'Bitmain Antminer'
elif 'whatsminer' in lower_name:
worker["type"] = 'ASIC'
worker["model"] = 'MicroBT Whatsminer'
elif 'bitaxe' in lower_name or 'nerdqaxe' in lower_name:
worker["type"] = 'Bitaxe'
worker["model"] = 'BitAxe Gamma 601'
workers.append(worker)
except Exception as e:
logging.error(f"Error parsing worker row: {e}")
continue
# Get daily sats from the ocean data
daily_sats = 0
try:
# Try to get this from the payoutsnap card
payout_snap = soup.find('div', id='payoutsnap-statcards')
if payout_snap:
for container in payout_snap.find_all('div', class_='blocks dashboard-container'):
label_div = container.find('div', class_='blocks-label')
if label_div and "earnings per day" in label_div.get_text(strip=True).lower():
value_span = label_div.find_next('span')
if value_span:
value_text = value_span.get_text(strip=True)
try:
btc_per_day = float(value_text.split()[0])
                                daily_sats = int(round(btc_per_day * self.sats_per_btc))
except (ValueError, IndexError):
pass
except Exception as e:
logging.error(f"Error parsing daily sats: {e}")
# Check if we found any workers
if not workers:
logging.warning("No workers found in the table, possibly a parsing issue")
return None
# Return worker stats dictionary
result = {
'workers': workers,
'total_hashrate': total_hashrate,
'hashrate_unit': 'TH/s', # Always use TH/s for consistent display
'workers_total': len(workers),
'workers_online': workers_online,
'workers_offline': workers_offline,
'total_earnings': total_earnings,
'daily_sats': daily_sats,
'timestamp': datetime.now(ZoneInfo(get_timezone())).isoformat()
}
logging.info(f"Successfully retrieved worker data: {len(workers)} workers")
return result
except Exception as e:
logging.error(f"Error fetching Ocean worker data: {e}")
import traceback
logging.error(traceback.format_exc())
return None
def get_worker_data_alternative(self):
"""
Alternative implementation to get worker data from Ocean.xyz.
This version consolidates worker rows from all pages using the wpage parameter.
Returns:
dict: Worker data dictionary with stats and list of workers.
"""
try:
logging.info("Fetching worker data across multiple pages (alternative method)")
# Get all worker rows from every page
rows = self.get_all_worker_rows()
if not rows:
logging.error("No worker rows found across any pages")
return None
workers = []
total_hashrate = 0
total_earnings = 0
workers_online = 0
workers_offline = 0
invalid_names = ['online', 'offline', 'status', 'worker', 'total']
# Process each row from all pages
for row_idx, row in enumerate(rows):
cells = row.find_all(['td', 'th'])
if not cells or len(cells) < 3:
continue
first_cell_text = cells[0].get_text(strip=True)
if first_cell_text.lower() in invalid_names:
continue
try:
worker_name = first_cell_text or f"Worker_{row_idx+1}"
worker = {
"name": worker_name,
"status": "online", # Default assumption
"type": "ASIC",
"model": "Unknown",
"hashrate_60sec": 0,
"hashrate_60sec_unit": "TH/s",
"hashrate_3hr": 0,
"hashrate_3hr_unit": "TH/s",
"efficiency": 90.0,
"last_share": "N/A",
"earnings": 0,
"power_consumption": 0,
"temperature": 0
}
# Extract status from second cell if available
if len(cells) > 1:
status_text = cells[1].get_text(strip=True).lower()
worker["status"] = "online" if "online" in status_text else "offline"
if worker["status"] == "online":
workers_online += 1
else:
workers_offline += 1
# Parse last share from third cell if available
if len(cells) > 2:
worker["last_share"] = cells[2].get_text(strip=True)
# Parse 60sec hashrate from fourth cell if available
if len(cells) > 3:
hashrate_60s_text = cells[3].get_text(strip=True)
try:
parts = hashrate_60s_text.split()
if parts:
worker["hashrate_60sec"] = float(parts[0])
if len(parts) > 1:
worker["hashrate_60sec_unit"] = parts[1]
except ValueError:
logging.warning(f"Could not parse 60-sec hashrate: {hashrate_60s_text}")
# Parse 3hr hashrate from fifth cell if available
if len(cells) > 4:
hashrate_3hr_text = cells[4].get_text(strip=True)
try:
parts = hashrate_3hr_text.split()
if parts:
worker["hashrate_3hr"] = float(parts[0])
if len(parts) > 1:
worker["hashrate_3hr_unit"] = parts[1]
                        # Normalize and add to total hashrate via the convert_to_ths helper
total_hashrate += convert_to_ths(worker["hashrate_3hr"], worker["hashrate_3hr_unit"])
except ValueError:
logging.warning(f"Could not parse 3hr hashrate: {hashrate_3hr_text}")
# Look for earnings in any cell containing 'btc'
for cell in cells:
cell_text = cell.get_text(strip=True)
if "btc" in cell_text.lower():
try:
earnings_match = re.search(r'([\d\.]+)', cell_text)
if earnings_match:
worker["earnings"] = float(earnings_match.group(1))
total_earnings += worker["earnings"]
except Exception:
pass
# Set worker type based on name
lower_name = worker["name"].lower()
if 'antminer' in lower_name:
worker["type"] = 'ASIC'
worker["model"] = 'Bitmain Antminer'
elif 'whatsminer' in lower_name:
worker["type"] = 'ASIC'
worker["model"] = 'MicroBT Whatsminer'
elif 'bitaxe' in lower_name or 'nerdqaxe' in lower_name:
worker["type"] = 'Bitaxe'
worker["model"] = 'BitAxe Gamma 601'
if worker["name"].lower() not in invalid_names:
workers.append(worker)
except Exception as e:
logging.error(f"Error parsing worker row: {e}")
continue
if not workers:
logging.error("No valid worker data parsed")
return None
result = {
'workers': workers,
'total_hashrate': total_hashrate,
'hashrate_unit': 'TH/s',
'workers_total': len(workers),
'workers_online': workers_online,
'workers_offline': workers_offline,
'total_earnings': total_earnings,
'timestamp': datetime.now(ZoneInfo(get_timezone())).isoformat()
}
logging.info(f"Successfully retrieved {len(workers)} workers across multiple pages")
return result
except Exception as e:
logging.error(f"Error in alternative worker data fetch: {e}")
return None

deployment_steps.md Normal file

@ -0,0 +1,375 @@
# Deployment Guide
This guide provides comprehensive instructions for deploying the Bitcoin Mining Dashboard application in various environments, from development to production.
## Prerequisites
- Python 3.9 or higher
- Redis server (optional, for persistent state and improved reliability)
- Docker and Docker Compose (optional, for containerized deployment)
- Network access to Ocean.xyz API endpoints
- Modern web browser (Chrome, Firefox, Edge recommended)
## Installation Options
### Option 1: Standard Installation (Development)
1. Clone the repository:
```bash
git clone https://github.com/Djobleezy/DeepSea-Dashboard.git
cd DeepSea-Dashboard
```
2. Create a virtual environment (recommended):
```bash
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
```
3. Install dependencies:
```bash
pip install -r requirements.txt
```
4. Run the setup script to organize files:
```bash
python setup.py
```
5. Start the application:
```bash
python App.py
```
6. Access the dashboard at `http://localhost:5000`
### Option 2: Production Deployment with Gunicorn
For better performance and reliability in production environments:
1. Follow steps 1-5 from standard installation
2. Install Gunicorn if not already installed:
```bash
pip install gunicorn
```
3. Start with Gunicorn:
```bash
gunicorn -b 0.0.0.0:5000 App:app --workers=1 --threads=12 --timeout=600 --keep-alive=5
```
> **Important**: Use only 1 worker to maintain shared state. Use threads for concurrency.
4. For a more robust setup, create a systemd service:
```bash
sudo nano /etc/systemd/system/mining-dashboard.service
```
Add the following content:
```
[Unit]
Description=Bitcoin Mining Dashboard
After=network.target
[Service]
User=your_username
WorkingDirectory=/path/to/bitcoin-mining-dashboard
ExecStart=/path/to/venv/bin/gunicorn -b 0.0.0.0:5000 App:app --workers=1 --threads=12 --timeout=600 --keep-alive=5
Restart=always
RestartSec=10
[Install]
WantedBy=multi-user.target
```
5. Enable and start the service:
```bash
sudo systemctl enable mining-dashboard
sudo systemctl start mining-dashboard
```
### Option 3: Docker Deployment
1. Build the Docker image:
```bash
docker build -t bitcoin-mining-dashboard .
```
2. Run the container:
```bash
docker run -d -p 5000:5000 \
-e WALLET=your-wallet-address \
-e POWER_COST=0.12 \
-e POWER_USAGE=3450 \
-v $(pwd)/logs:/app/logs \
--name mining-dashboard \
bitcoin-mining-dashboard
```
3. Access the dashboard at `http://localhost:5000`
### Option 4: Docker Compose with Redis Persistence
1. Create a `docker-compose.yml` file:
```yaml
version: '3'
services:
redis:
image: redis:alpine
restart: unless-stopped
volumes:
- redis_data:/data
dashboard:
build: .
restart: unless-stopped
ports:
- "5000:5000"
environment:
- REDIS_URL=redis://redis:6379
- WALLET=your-wallet-address
- POWER_COST=0.12
- POWER_USAGE=3450
volumes:
- ./logs:/app/logs
depends_on:
- redis
volumes:
redis_data:
```
2. Launch the services:
```bash
docker-compose up -d
```
3. Access the dashboard at `http://localhost:5000`
## Environment Variables
The application can be configured using environment variables:
| Variable | Description | Default |
|----------|-------------|---------|
| `REDIS_URL` | Redis connection URL for persistent state | None |
| `WALLET` | Ocean.xyz wallet address | From config.json |
| `POWER_COST` | Electricity cost per kWh | From config.json |
| `POWER_USAGE` | Power consumption in watts | From config.json |
| `FLASK_ENV` | Application environment | development |
| `LOG_LEVEL` | Logging level | INFO |
| `PORT` | Application port | 5000 |
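On a standard (non-Docker) install, the same variables can be exported in the shell before launching the app. A minimal sketch; the wallet and power values below are placeholders, not working credentials:

```shell
# Placeholder values -- substitute your own wallet address and power figures
export WALLET="bc1qexampleaddress"
export POWER_COST=0.12
export POWER_USAGE=3450
export LOG_LEVEL=INFO
# Then start the app as usual: python App.py
echo "WALLET=$WALLET POWER_COST=$POWER_COST POWER_USAGE=$POWER_USAGE"
```

Environment variables take precedence over `config.json` defaults, so this is a convenient way to test alternate settings without editing the file.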
## Reverse Proxy Configuration
For production deployments, it's recommended to use a reverse proxy like Nginx:
1. Install Nginx:
```bash
sudo apt update
sudo apt install nginx
```
2. Create a configuration file:
```bash
sudo nano /etc/nginx/sites-available/mining-dashboard
```
3. Add the following configuration:
```
server {
listen 80;
server_name your-domain.com;
location / {
proxy_pass http://localhost:5000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_buffering off;
proxy_cache off;
}
}
```
4. Create a symbolic link:
```bash
sudo ln -s /etc/nginx/sites-available/mining-dashboard /etc/nginx/sites-enabled/
```
5. Test and restart Nginx:
```bash
sudo nginx -t
sudo systemctl restart nginx
```
6. (Optional) Add SSL with Certbot:
```bash
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d your-domain.com
```
## Maintenance
### Logs
Logs are stored in the `logs` directory by default. Monitor these logs for errors and warnings:
```bash
tail -f logs/dashboard.log
```
Common log patterns to watch for:
- `ERROR fetching metrics` - Indicates issues with Ocean.xyz API
- `Failed to connect to Redis` - Redis connection problems
- `Scheduler stopped unexpectedly` - Background job issues
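A small helper can scan the log for exactly these patterns; a sketch, assuming the default `logs/dashboard.log` path:

```shell
# Scan the tail of a log file for the known error patterns listed above
scan_log() {
    tail -n 1000 "$1" 2>/dev/null | \
        grep -E 'ERROR fetching metrics|Failed to connect to Redis|Scheduler stopped unexpectedly'
}
scan_log logs/dashboard.log || echo "no known error patterns found"
```

Run it from the application directory, or point it at wherever your logs land; wiring it into cron gives a crude alerting mechanism.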
### Health Monitoring
#### Health Check Endpoint
The health check endpoint at `/api/health` returns:
- Application status (healthy, degraded, unhealthy)
- Uptime information
- Memory usage
- Data freshness
- Redis connection status
- Scheduler status
Example health check command:
```bash
curl http://localhost:5000/api/health | jq
```
#### Scheduler Health
To monitor the scheduler:
```bash
curl http://localhost:5000/api/scheduler-health | jq
```
### Performance Tuning
1. **Redis Configuration**: For high-traffic deployments, tune Redis:
```
maxmemory 256mb
maxmemory-policy allkeys-lru
```
2. **Gunicorn Threads**: Adjust thread count based on CPU cores:
```
--threads=$(( 2 * $(nproc) ))
```
3. **Browser Cache Headers**: Cache headers are already set by the application; no additional tuning is required
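The thread rule of thumb from item 2 can be computed at launch time rather than hard-coded; a sketch (the fallback core count is an assumption for systems without `nproc`):

```shell
# Derive a gunicorn --threads value: 2 threads per CPU core, as suggested above
CORES=$(nproc 2>/dev/null || echo 2)   # fall back to 2 cores if nproc is unavailable
THREADS=$(( 2 * CORES ))
echo "gunicorn -b 0.0.0.0:5000 App:app --workers=1 --threads=$THREADS"
```

Keep `--workers=1` regardless of core count; only the thread count should scale, since the application relies on shared in-process state.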
## Troubleshooting
### Common Issues
1. **Application not updating data**:
- Check network connectivity to Ocean.xyz
- Verify scheduler health:
```bash
curl http://localhost:5000/api/scheduler-health
```
- Force a data refresh:
```bash
curl -X POST http://localhost:5000/api/force-refresh
```
2. **High memory usage**:
- Check for memory leaks in log files
- Restart the application
- Enable Redis for better state management
3. **Scheduler failures**:
- Fix the scheduler:
```bash
curl -X POST http://localhost:5000/api/fix-scheduler
```
4. **Workers not showing**:
- Verify your wallet address is correct
- Check worker data:
```bash
curl http://localhost:5000/api/workers
```
### Recovery Procedures
If the application becomes unresponsive:
1. Check the logs for error messages
2. Restart the application:
```bash
sudo systemctl restart mining-dashboard
```
3. If Redis is used and may be corrupted:
```bash
sudo systemctl restart redis
```
4. For Docker deployments:
```bash
docker-compose restart
```
## Updating
To update the application:
1. Pull the latest changes:
```bash
git pull origin main
```
2. Update dependencies:
```bash
pip install -r requirements.txt --upgrade
```
3. Run the setup script:
```bash
python setup.py
```
4. Restart the application:
```bash
sudo systemctl restart mining-dashboard
```
### Docker Update Procedure
1. Pull the latest changes:
```bash
git pull origin main
```
2. Rebuild and restart:
```bash
docker-compose build
docker-compose up -d
```
## Backup Strategy
1. **Configuration**: Regularly backup your `config.json` file
2. **Redis Data**: If using Redis, set up regular RDB snapshots
3. **Logs**: Implement log rotation and archiving
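Items 1 and 2 can be combined into a small script run from cron; a sketch, assuming a local Redis with the default RDB path (adjust paths to your install):

```shell
# Minimal backup sketch -- paths below are assumptions, adjust to your install
backup_file() {
    # Copy $1 into $BACKUP_DIR with a timestamp suffix, if it exists
    mkdir -p "$BACKUP_DIR"
    if [ -f "$1" ]; then
        cp "$1" "$BACKUP_DIR/$(basename "$1").$(date +%Y%m%d-%H%M%S)"
    fi
}
BACKUP_DIR="${BACKUP_DIR:-$HOME/dashboard-backups}"
backup_file config.json
backup_file /var/lib/redis/dump.rdb   # default Redis RDB location (assumption)
echo "backups stored in $BACKUP_DIR"
```

A daily cron entry invoking this script covers both the configuration and the Redis snapshot; pair it with `logrotate` for item 3.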
## Security Recommendations
1. **Run as Non-Root User**: Always run the application as a non-root user
2. **Firewall Configuration**: Restrict access to ports 5000 and 6379 (Redis)
3. **Redis Authentication**: Enable Redis password authentication:
```
requirepass your_strong_password
```
4. **HTTPS**: Use SSL/TLS for all production deployments
5. **Regular Updates**: Keep all dependencies updated

docker-compose.yml Normal file

@ -0,0 +1,42 @@
version: '3'
services:
redis:
image: redis:alpine
restart: unless-stopped
volumes:
- redis_data:/data
ports:
- "6379:6379"
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 3
dashboard:
build: .
restart: unless-stopped
ports:
- "5000:5000"
environment:
- REDIS_URL=redis://redis:6379
- WALLET=35eS5Lsqw8NCjFJ8zhp9JaEmyvLDwg6XtS
- POWER_COST=0
- POWER_USAGE=0
- NETWORK_FEE=0
- TIMEZONE=America/Los_Angeles
- LOG_LEVEL=INFO
volumes:
- ./logs:/app/logs
depends_on:
redis:
condition: service_healthy
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:5000/api/health"]
interval: 30s
timeout: 10s
retries: 3
volumes:
redis_data:

Dockerfile

@ -1,4 +1,4 @@
FROM python:3.9-slim
FROM python:3.9.18-slim
WORKDIR /app
@ -8,40 +8,50 @@ RUN apt-get update && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
# Install dependencies first to leverage Docker cache.
# Install dependencies first to leverage Docker cache
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the entire application.
COPY . .
# Copy the application files
COPY *.py .
COPY config.json .
COPY setup.py .
# Run the minifier to process HTML templates.
# Create all necessary directories in one command
RUN mkdir -p static/css static/js templates logs /app/logs
# Copy static files and templates
COPY static/css/*.css static/css/
COPY static/js/*.js static/js/
COPY templates/*.html templates/
# Run the setup script to ensure proper organization
RUN python setup.py
# Run the minifier to process HTML templates
RUN python minify.py
# Create a non-root user first.
# Create a non-root user for better security
RUN adduser --disabled-password --gecos '' appuser
# Change ownership of the /app directory so that appuser can write files.
# Change ownership of the /app directory so appuser can write files
RUN chown -R appuser:appuser /app
# Create a directory for logs with proper permissions
RUN mkdir -p /app/logs && chown -R appuser:appuser /app/logs
# Switch to non-root user
USER appuser
# Expose the application port
EXPOSE 5000
# Add environment variables for app configuration
# Set environment variables
ENV FLASK_ENV=production
ENV PYTHONUNBUFFERED=1
ENV PYTHON_UNBUFFERED=1
# Improve healthcheck reliability - use new health endpoint
# Add healthcheck
HEALTHCHECK --interval=15s --timeout=5s --start-period=30s --retries=3 \
CMD curl -f http://localhost:5000/api/health || exit 1
# Use Gunicorn as the production WSGI server with improved settings
# For shared global state, we need to keep the single worker model but optimize other parameters
# Use Gunicorn as the production WSGI server
CMD ["gunicorn", "-b", "0.0.0.0:5000", "App:app", \
"--workers=1", \
"--threads=12", \
@ -52,4 +62,4 @@ CMD ["gunicorn", "-b", "0.0.0.0:5000", "App:app", \
"--error-logfile=-", \
"--log-file=-", \
"--graceful-timeout=60", \
"--worker-tmp-dir=/dev/shm"]
"--worker-tmp-dir=/dev/shm"]

minify.py

@ -1,76 +1,246 @@
#!/usr/bin/env python3
import os
import jsmin
import htmlmin
import logging
from pathlib import Path
# Set up logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logging.basicConfig(level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)
TEMPLATES_DIR = "templates"
HTML_FILES = ["index.html", "error.html"]
def minify_html_file(file_path):
"""
Minify an HTML file with error handling
"""
try:
with open(file_path, "r", encoding="utf-8") as f:
content = f.read()
# Check if file has content
if not content.strip():
logging.warning(f"File {file_path} is empty. Skipping.")
return
# Minify the content
try:
minified = htmlmin.minify(content,
remove_comments=True,
remove_empty_space=True,
remove_all_empty_space=False,
reduce_boolean_attributes=True)
# Make sure minification worked and didn't remove everything
if not minified.strip():
logging.error(f"Minification of {file_path} resulted in empty content. Using original.")
minified = content
def minify_js_files():
"""Minify JavaScript files."""
js_dir = 'static/js'
min_dir = os.path.join(js_dir, 'min')
os.makedirs(min_dir, exist_ok=True)
minified_count = 0
skipped_count = 0
for js_file in os.listdir(js_dir):
if js_file.endswith('.js') and not js_file.endswith('.min.js'):
try:
input_path = os.path.join(js_dir, js_file)
output_path = os.path.join(min_dir, js_file.replace('.js', '.min.js'))
# Write back the minified content
with open(file_path, "w", encoding="utf-8") as f:
f.write(minified)
# Skip already minified files if they're newer than source
if os.path.exists(output_path) and \
os.path.getmtime(output_path) > os.path.getmtime(input_path):
logger.info(f"Skipping {js_file} (already up to date)")
skipped_count += 1
continue
logging.info(f"Minified {file_path}")
except Exception as e:
logging.error(f"Error minifying {file_path}: {e}")
except Exception as e:
logging.error(f"Error reading file {file_path}: {e}")
with open(input_path, 'r', encoding='utf-8') as f:
js_content = f.read()
# Minify the content
minified = jsmin.jsmin(js_content)
# Write minified content
with open(output_path, 'w', encoding='utf-8') as f:
f.write(minified)
size_original = len(js_content)
size_minified = len(minified)
reduction = (1 - size_minified / size_original) * 100 if size_original > 0 else 0
logger.info(f"Minified {js_file} - Reduced by {reduction:.1f}%")
minified_count += 1
except Exception as e:
logger.error(f"Error processing {js_file}: {e}")
logger.info(f"JavaScript minification: {minified_count} files minified, {skipped_count} files skipped")
return minified_count
def ensure_templates_dir():
"""
Ensure templates directory exists
"""
if not os.path.exists(TEMPLATES_DIR):
try:
os.makedirs(TEMPLATES_DIR)
logging.info(f"Created templates directory: {TEMPLATES_DIR}")
except Exception as e:
logging.error(f"Error creating templates directory: {e}")
return False
return True
def minify_css_files():
"""Minify CSS files using simple compression techniques."""
css_dir = 'static/css'
min_dir = os.path.join(css_dir, 'min')
os.makedirs(min_dir, exist_ok=True)
minified_count = 0
skipped_count = 0
for css_file in os.listdir(css_dir):
if css_file.endswith('.css') and not css_file.endswith('.min.css'):
try:
input_path = os.path.join(css_dir, css_file)
output_path = os.path.join(min_dir, css_file.replace('.css', '.min.css'))
# Skip already minified files if they're newer than source
if os.path.exists(output_path) and \
os.path.getmtime(output_path) > os.path.getmtime(input_path):
logger.info(f"Skipping {css_file} (already up to date)")
skipped_count += 1
continue
with open(input_path, 'r', encoding='utf-8') as f:
css_content = f.read()
# Simple CSS minification using string replacements
# Remove comments
import re
css_minified = re.sub(r'/\*[\s\S]*?\*/', '', css_content)
# Remove whitespace
css_minified = re.sub(r'\s+', ' ', css_minified)
# Remove spaces around selectors
css_minified = re.sub(r'\s*{\s*', '{', css_minified)
css_minified = re.sub(r'\s*}\s*', '}', css_minified)
css_minified = re.sub(r'\s*;\s*', ';', css_minified)
css_minified = re.sub(r'\s*:\s*', ':', css_minified)
css_minified = re.sub(r'\s*,\s*', ',', css_minified)
# Remove last semicolons
css_minified = re.sub(r';}', '}', css_minified)
with open(output_path, 'w', encoding='utf-8') as f:
f.write(css_minified)
size_original = len(css_content)
size_minified = len(css_minified)
reduction = (1 - size_minified / size_original) * 100 if size_original > 0 else 0
logger.info(f"Minified {css_file} - Reduced by {reduction:.1f}%")
minified_count += 1
except Exception as e:
logger.error(f"Error processing {css_file}: {e}")
logger.info(f"CSS minification: {minified_count} files minified, {skipped_count} files skipped")
return minified_count
def minify_html_templates():
"""Minify HTML template files."""
templates_dir = 'templates'
minified_count = 0
skipped_count = 0
for html_file in os.listdir(templates_dir):
if html_file.endswith('.html'):
try:
input_path = os.path.join(templates_dir, html_file)
with open(input_path, 'r', encoding='utf-8') as f:
html_content = f.read()
# Minify HTML content while keeping important whitespace
minified = htmlmin.minify(html_content,
remove_comments=True,
remove_empty_space=True,
remove_all_empty_space=False,
reduce_boolean_attributes=True)
# Write back to the same file
with open(input_path, 'w', encoding='utf-8') as f:
f.write(minified)
size_original = len(html_content)
size_minified = len(minified)
reduction = (1 - size_minified / size_original) * 100 if size_original > 0 else 0
logger.info(f"Minified {html_file} - Reduced by {reduction:.1f}%")
minified_count += 1
except Exception as e:
logger.error(f"Error processing {html_file}: {e}")
logger.info(f"HTML minification: {minified_count} files minified, {skipped_count} files skipped")
return minified_count
def create_size_report():
"""Create a report of file sizes before and after minification."""
results = []
# Check JS files
js_dir = 'static/js'
min_dir = os.path.join(js_dir, 'min')
if os.path.exists(min_dir):
for js_file in os.listdir(js_dir):
if js_file.endswith('.js') and not js_file.endswith('.min.js'):
orig_path = os.path.join(js_dir, js_file)
min_path = os.path.join(min_dir, js_file.replace('.js', '.min.js'))
if os.path.exists(min_path):
orig_size = os.path.getsize(orig_path)
min_size = os.path.getsize(min_path)
reduction = (1 - min_size / orig_size) * 100 if orig_size > 0 else 0
results.append({
'file': js_file,
'type': 'JavaScript',
'original_size': orig_size,
'minified_size': min_size,
'reduction': reduction
})
# Check CSS files
css_dir = 'static/css'
min_dir = os.path.join(css_dir, 'min')
if os.path.exists(min_dir):
for css_file in os.listdir(css_dir):
if css_file.endswith('.css') and not css_file.endswith('.min.css'):
orig_path = os.path.join(css_dir, css_file)
min_path = os.path.join(min_dir, css_file.replace('.css', '.min.css'))
if os.path.exists(min_path):
orig_size = os.path.getsize(orig_path)
min_size = os.path.getsize(min_path)
reduction = (1 - min_size / orig_size) * 100 if orig_size > 0 else 0
results.append({
'file': css_file,
'type': 'CSS',
'original_size': orig_size,
'minified_size': min_size,
'reduction': reduction
})
# Print the report
total_orig = sum(item['original_size'] for item in results)
total_min = sum(item['minified_size'] for item in results)
total_reduction = (1 - total_min / total_orig) * 100 if total_orig > 0 else 0
logger.info("\n" + "="*50)
logger.info("MINIFICATION REPORT")
logger.info("="*50)
logger.info(f"{'File':<30} {'Type':<10} {'Original':<10} {'Minified':<10} {'Reduction'}")
logger.info("-"*70)
for item in results:
logger.info(f"{item['file']:<30} {item['type']:<10} "
f"{item['original_size']/1024:.1f}KB {item['minified_size']/1024:.1f}KB "
f"{item['reduction']:.1f}%")
logger.info("-"*70)
logger.info(f"{'TOTAL:':<30} {'':<10} {total_orig/1024:.1f}KB {total_min/1024:.1f}KB {total_reduction:.1f}%")
logger.info("="*50)
def main():
"""Main function to run minification tasks."""
import argparse
parser = argparse.ArgumentParser(description='Minify web assets')
parser.add_argument('--js', action='store_true', help='Minify JavaScript files')
parser.add_argument('--css', action='store_true', help='Minify CSS files')
parser.add_argument('--html', action='store_true', help='Minify HTML templates')
parser.add_argument('--all', action='store_true', help='Minify all assets')
parser.add_argument('--report', action='store_true', help='Generate size report only')
args = parser.parse_args()
# If no arguments, default to --all
if not (args.js or args.css or args.html or args.report):
args.all = True
if args.all or args.js:
minify_js_files()
if args.all or args.css:
minify_css_files()
if args.all or args.html:
minify_html_templates()
# Always generate the report at the end if any minification was done
if args.report or args.all or args.js or args.css:
create_size_report()
if __name__ == "__main__":
logging.info("Starting HTML minification process")
if not ensure_templates_dir():
logging.error("Templates directory does not exist and could not be created. Exiting.")
exit(1)
for filename in HTML_FILES:
file_path = os.path.join(TEMPLATES_DIR, filename)
if os.path.exists(file_path):
minify_html_file(file_path)
else:
logging.warning(f"File {file_path} not found.")
logging.info("HTML minification process completed")
main()

models.py Normal file

@ -0,0 +1,183 @@
"""
Data models for the Bitcoin Mining Dashboard.
"""
from dataclasses import dataclass
from typing import Optional, Dict, List, Union, Any
import logging
@dataclass
class OceanData:
"""Data structure for Ocean.xyz pool mining data."""
# Keep original definitions with None default to maintain backward compatibility
pool_total_hashrate: float = None
pool_total_hashrate_unit: str = None
hashrate_24hr: float = None
hashrate_24hr_unit: str = None
hashrate_3hr: float = None
hashrate_3hr_unit: str = None
hashrate_10min: float = None
hashrate_10min_unit: str = None
hashrate_5min: float = None
hashrate_5min_unit: str = None
hashrate_60sec: float = None
hashrate_60sec_unit: str = None
estimated_earnings_per_day: float = None
estimated_earnings_next_block: float = None
estimated_rewards_in_window: float = None
workers_hashing: int = None
unpaid_earnings: float = None
est_time_to_payout: str = None
last_block: str = None
last_block_height: str = None
last_block_time: str = None
blocks_found: str = None
total_last_share: str = "N/A"
last_block_earnings: str = None
pool_fees_percentage: float = None # Added missing attribute
def get_normalized_hashrate(self, timeframe: str = "3hr") -> float:
"""
Get a normalized hashrate value in TH/s regardless of original units.
Args:
timeframe: The timeframe to get ("24hr", "3hr", "10min", "5min", "60sec")
Returns:
float: Normalized hashrate in TH/s
"""
if timeframe == "24hr" and self.hashrate_24hr is not None:
return convert_to_ths(self.hashrate_24hr, self.hashrate_24hr_unit)
elif timeframe == "3hr" and self.hashrate_3hr is not None:
return convert_to_ths(self.hashrate_3hr, self.hashrate_3hr_unit)
elif timeframe == "10min" and self.hashrate_10min is not None:
return convert_to_ths(self.hashrate_10min, self.hashrate_10min_unit)
elif timeframe == "5min" and self.hashrate_5min is not None:
return convert_to_ths(self.hashrate_5min, self.hashrate_5min_unit)
elif timeframe == "60sec" and self.hashrate_60sec is not None:
return convert_to_ths(self.hashrate_60sec, self.hashrate_60sec_unit)
return 0.0
def to_dict(self) -> Dict[str, Any]:
"""Convert the OceanData object to a dictionary."""
return {k: v for k, v in self.__dict__.items()}
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> 'OceanData':
"""Create an OceanData instance from a dictionary."""
filtered_data = {}
for k, v in data.items():
if k in cls.__annotations__:
filtered_data[k] = v
return cls(**filtered_data)
@dataclass
class WorkerData:
"""Data structure for individual worker information."""
name: str = None
status: str = "offline"
type: str = "ASIC" # ASIC or Bitaxe
model: str = "Unknown"
hashrate_60sec: float = 0
hashrate_60sec_unit: str = "TH/s"
hashrate_3hr: float = 0
hashrate_3hr_unit: str = "TH/s"
efficiency: float = 0
last_share: str = "N/A"
earnings: float = 0
acceptance_rate: float = 0
power_consumption: float = 0
temperature: float = 0
def __post_init__(self):
"""
Validate worker data after initialization.
Ensures values are within acceptable ranges and formats.
"""
# Ensure hashrates are non-negative
if self.hashrate_60sec is not None and self.hashrate_60sec < 0:
self.hashrate_60sec = 0
if self.hashrate_3hr is not None and self.hashrate_3hr < 0:
self.hashrate_3hr = 0
# Ensure status is valid, but don't raise exceptions for backward compatibility
if self.status not in ["online", "offline"]:
logging.warning(f"Worker {self.name}: Invalid status '{self.status}', using 'offline'")
self.status = "offline"
# Ensure type is valid, but don't raise exceptions for backward compatibility
if self.type not in ["ASIC", "Bitaxe"]:
logging.warning(f"Worker {self.name}: Invalid type '{self.type}', using 'ASIC'")
self.type = "ASIC"
def get_normalized_hashrate(self, timeframe: str = "3hr") -> float:
"""
Get normalized hashrate in TH/s.
Args:
timeframe: The timeframe to get ("3hr" or "60sec")
Returns:
float: Normalized hashrate in TH/s
"""
if timeframe == "3hr":
return convert_to_ths(self.hashrate_3hr, self.hashrate_3hr_unit)
elif timeframe == "60sec":
return convert_to_ths(self.hashrate_60sec, self.hashrate_60sec_unit)
return 0.0
def to_dict(self) -> Dict[str, Any]:
"""Convert the WorkerData object to a dictionary."""
return {k: v for k, v in self.__dict__.items()}
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> 'WorkerData':
"""Create a WorkerData instance from a dictionary."""
filtered_data = {}
for k, v in data.items():
if k in cls.__annotations__:
filtered_data[k] = v
return cls(**filtered_data)
class HashRateConversionError(Exception):
"""Exception raised for errors in hashrate unit conversion."""
pass
def convert_to_ths(value, unit):
"""
Convert any hashrate unit to TH/s equivalent.
Args:
value (float): The numerical value of the hashrate
unit (str): The unit of measurement (e.g., 'PH/s', 'EH/s', etc.)
Returns:
float: The hashrate value in TH/s
"""
if value is None or value == 0:
return 0
try:
unit = unit.lower() if unit else 'th/s'
if 'ph/s' in unit:
return value * 1000 # 1 PH/s = 1000 TH/s
elif 'eh/s' in unit:
return value * 1000000 # 1 EH/s = 1,000,000 TH/s
elif 'gh/s' in unit:
return value / 1000 # 1 TH/s = 1000 GH/s
elif 'mh/s' in unit:
return value / 1000000 # 1 TH/s = 1,000,000 MH/s
elif 'kh/s' in unit:
return value / 1000000000 # 1 TH/s = 1,000,000,000 KH/s
elif 'h/s' in unit and not any(prefix in unit for prefix in ['th/s', 'ph/s', 'eh/s', 'gh/s', 'mh/s', 'kh/s']):
return value / 1000000000000 # 1 TH/s = 1,000,000,000,000 H/s
elif 'th/s' in unit:
return value
else:
# Log unexpected unit
logging.warning(f"Unexpected hashrate unit: {unit}, defaulting to treating as TH/s")
return value
except Exception as e:
logging.error(f"Error in convert_to_ths: {e}")
return value # Return original value as fallback

notification_service.py Normal file

@ -0,0 +1,548 @@
# notification_service.py
import logging
import json
import time
import uuid
from datetime import datetime, timedelta
from enum import Enum
from collections import deque
from typing import List, Dict, Any, Optional, Union

# Constants to replace magic values
ONE_DAY_SECONDS = 86400
DEFAULT_TARGET_HOUR = 12
SIGNIFICANT_HASHRATE_CHANGE_PERCENT = 25
NOTIFICATION_WINDOW_MINUTES = 5


class NotificationLevel(Enum):
    INFO = "info"
    SUCCESS = "success"
    WARNING = "warning"
    ERROR = "error"


class NotificationCategory(Enum):
    HASHRATE = "hashrate"
    BLOCK = "block"
    WORKER = "worker"
    EARNINGS = "earnings"
    SYSTEM = "system"


class NotificationService:
    """Service for managing mining dashboard notifications."""

    def __init__(self, state_manager):
        """Initialize with state manager for persistence."""
        self.state_manager = state_manager
        self.notifications = []
        self.daily_stats_time = "00:00:00"  # When to post daily stats (midnight)
        self.last_daily_stats = None
        self.max_notifications = 100  # Maximum number to store
        self.last_block_height = None  # Track the last seen block height
        self.last_payout_notification_time = None  # Track the last payout notification time
        self.last_estimated_payout_time = None  # Track the last estimated payout time

        # Load existing notifications from state
        self._load_notifications()

        # Load last block height from state
        self._load_last_block_height()

    def _get_redis_value(self, key: str, default: Any = None) -> Any:
        """Generic method to retrieve values from Redis."""
        try:
            if hasattr(self.state_manager, 'redis_client') and self.state_manager.redis_client:
                value = self.state_manager.redis_client.get(key)
                if value:
                    return value.decode('utf-8')
            return default
        except Exception as e:
            logging.error(f"[NotificationService] Error retrieving {key} from Redis: {e}")
            return default

    def _set_redis_value(self, key: str, value: Any) -> bool:
        """Generic method to set values in Redis."""
        try:
            if hasattr(self.state_manager, 'redis_client') and self.state_manager.redis_client:
                self.state_manager.redis_client.set(key, str(value))
                logging.info(f"[NotificationService] Saved {key} to Redis: {value}")
                return True
            return False
        except Exception as e:
            logging.error(f"[NotificationService] Error saving {key} to Redis: {e}")
            return False

    def _load_notifications(self) -> None:
        """Load notifications with enhanced error handling."""
        try:
            stored_notifications = self.state_manager.get_notifications()
            if stored_notifications:
                self.notifications = stored_notifications
                logging.info(f"[NotificationService] Loaded {len(self.notifications)} notifications from storage")
            else:
                self.notifications = []
                logging.info("[NotificationService] No notifications found in storage, starting with empty list")
        except Exception as e:
            logging.error(f"[NotificationService] Error loading notifications: {e}")
            self.notifications = []  # Ensure we have a valid list

    def _load_last_block_height(self) -> None:
        """Load last block height from persistent storage."""
        try:
            self.last_block_height = self._get_redis_value("last_block_height")
            if self.last_block_height:
                logging.info(f"[NotificationService] Loaded last block height from storage: {self.last_block_height}")
            else:
                logging.info("[NotificationService] No last block height found, starting with None")
        except Exception as e:
            logging.error(f"[NotificationService] Error loading last block height: {e}")

    def _save_last_block_height(self) -> None:
        """Save last block height to persistent storage."""
        if self.last_block_height:
            self._set_redis_value("last_block_height", self.last_block_height)

    def _save_notifications(self) -> None:
        """Save notifications with improved pruning."""
        try:
            # Sort by timestamp before pruning to ensure we keep the most recent
            if len(self.notifications) > self.max_notifications:
                self.notifications.sort(key=lambda n: n.get("timestamp", ""), reverse=True)
                self.notifications = self.notifications[:self.max_notifications]
            self.state_manager.save_notifications(self.notifications)
            logging.info(f"[NotificationService] Saved {len(self.notifications)} notifications")
        except Exception as e:
            logging.error(f"[NotificationService] Error saving notifications: {e}")

    def add_notification(self,
                         message: str,
                         level: NotificationLevel = NotificationLevel.INFO,
                         category: NotificationCategory = NotificationCategory.SYSTEM,
                         data: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
        """
        Add a new notification.

        Args:
            message (str): Notification message text
            level (NotificationLevel): Severity level
            category (NotificationCategory): Classification category
            data (dict, optional): Additional data for the notification

        Returns:
            dict: The created notification
        """
        notification = {
            "id": str(uuid.uuid4()),
            "timestamp": datetime.now().isoformat(),
            "message": message,
            "level": level.value,
            "category": category.value,
            "read": False
        }
        if data:
            notification["data"] = data
        self.notifications.append(notification)
        self._save_notifications()
        logging.info(f"[NotificationService] Added notification: {message}")
        return notification

    def get_notifications(self,
                          limit: int = 50,
                          offset: int = 0,
                          unread_only: bool = False,
                          category: Optional[str] = None,
                          level: Optional[str] = None) -> List[Dict[str, Any]]:
        """
        Get filtered notifications with optimized filtering.

        Args:
            limit (int): Maximum number to return
            offset (int): Starting offset for pagination
            unread_only (bool): Only return unread notifications
            category (str): Filter by category
            level (str): Filter by level

        Returns:
            list: Filtered notifications
        """
        # Apply all filters in a single pass
        filtered = [
            n for n in self.notifications
            if (not unread_only or not n.get("read", False)) and
               (not category or n.get("category") == category) and
               (not level or n.get("level") == level)
        ]
        # Sort by timestamp (newest first)
        filtered = sorted(filtered, key=lambda n: n.get("timestamp", ""), reverse=True)
        # Apply pagination
        return filtered[offset:offset + limit]

    def get_unread_count(self) -> int:
        """Get count of unread notifications."""
        return sum(1 for n in self.notifications if not n.get("read", False))

    def mark_as_read(self, notification_id: Optional[str] = None) -> bool:
        """
        Mark notification(s) as read.

        Args:
            notification_id (str, optional): ID of specific notification to mark read,
                                             or None to mark all as read

        Returns:
            bool: True if successful
        """
        if notification_id:
            # Mark specific notification as read
            for n in self.notifications:
                if n.get("id") == notification_id:
                    n["read"] = True
                    logging.info(f"[NotificationService] Marked notification {notification_id} as read")
                    break
        else:
            # Mark all as read
            for n in self.notifications:
                n["read"] = True
            logging.info(f"[NotificationService] Marked all {len(self.notifications)} notifications as read")
        self._save_notifications()
        return True

    def delete_notification(self, notification_id: str) -> bool:
        """
        Delete a specific notification.

        Args:
            notification_id (str): ID of notification to delete

        Returns:
            bool: True if successful
        """
        original_count = len(self.notifications)
        self.notifications = [n for n in self.notifications if n.get("id") != notification_id]
        deleted = original_count - len(self.notifications)
        if deleted > 0:
            logging.info(f"[NotificationService] Deleted notification {notification_id}")
            self._save_notifications()
        return deleted > 0

    def clear_notifications(self, category: Optional[str] = None, older_than_days: Optional[int] = None) -> int:
        """
        Clear notifications with optimized filtering.

        Args:
            category (str, optional): Only clear specific category
            older_than_days (int, optional): Only clear notifications older than this

        Returns:
            int: Number of notifications cleared
        """
        original_count = len(self.notifications)
        cutoff_date = None
        if older_than_days:
            cutoff_date = datetime.now() - timedelta(days=older_than_days)
        # Apply filters in a single pass
        self.notifications = [
            n for n in self.notifications
            if (not category or n.get("category") != category) and
               (not cutoff_date or datetime.fromisoformat(n.get("timestamp", datetime.now().isoformat())) >= cutoff_date)
]
cleared_count = original_count - len(self.notifications)
if cleared_count > 0:
logging.info(f"[NotificationService] Cleared {cleared_count} notifications")
self._save_notifications()
return cleared_count
def check_and_generate_notifications(self, current_metrics: Dict[str, Any], previous_metrics: Optional[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Check metrics and generate notifications for significant events.
Args:
current_metrics: Current system metrics
previous_metrics: Previous system metrics for comparison
Returns:
list: Newly created notifications
"""
new_notifications = []
try:
# Skip if no metrics
if not current_metrics:
logging.warning("[NotificationService] No current metrics available, skipping notification checks")
return new_notifications
# Check for block updates (using persistent storage)
last_block_height = current_metrics.get("last_block_height")
if last_block_height and last_block_height != "N/A":
if self.last_block_height is not None and self.last_block_height != last_block_height:
logging.info(f"[NotificationService] Block change detected: {self.last_block_height} -> {last_block_height}")
block_notification = self._generate_block_notification(current_metrics)
if block_notification:
new_notifications.append(block_notification)
# Always update the stored last block height when it changes
if self.last_block_height != last_block_height:
self.last_block_height = last_block_height
self._save_last_block_height()
# Regular comparison with previous metrics
if previous_metrics:
# Check for daily stats
if self._should_post_daily_stats():
stats_notification = self._generate_daily_stats(current_metrics)
if stats_notification:
new_notifications.append(stats_notification)
# Check for significant hashrate drop
hashrate_notification = self._check_hashrate_change(current_metrics, previous_metrics)
if hashrate_notification:
new_notifications.append(hashrate_notification)
# Check for earnings and payout progress
earnings_notification = self._check_earnings_progress(current_metrics, previous_metrics)
if earnings_notification:
new_notifications.append(earnings_notification)
return new_notifications
except Exception as e:
logging.error(f"[NotificationService] Error generating notifications: {e}")
error_notification = self.add_notification(
f"Error generating notifications: {str(e)}",
level=NotificationLevel.ERROR,
category=NotificationCategory.SYSTEM
)
return [error_notification]
def _should_post_daily_stats(self) -> bool:
"""Check if it's time to post daily stats with improved clarity."""
now = datetime.now()
# Only proceed if we're in the target hour and within first 5 minutes
if now.hour != DEFAULT_TARGET_HOUR or now.minute >= NOTIFICATION_WINDOW_MINUTES:
return False
# If we have a last_daily_stats timestamp, check if it's a different day
if self.last_daily_stats and now.date() <= self.last_daily_stats.date():
return False
# All conditions met, update timestamp and return True
logging.info(f"[NotificationService] Posting daily stats at {now.hour}:{now.minute}")
self.last_daily_stats = now
return True
def _generate_daily_stats(self, metrics: Dict[str, Any]) -> Optional[Dict[str, Any]]:
"""Generate daily stats notification."""
try:
if not metrics:
logging.warning("[NotificationService] No metrics available for daily stats")
return None
# Format hashrate with appropriate unit
hashrate_24hr = metrics.get("hashrate_24hr", 0)
hashrate_unit = metrics.get("hashrate_24hr_unit", "TH/s")
# Format daily earnings
daily_mined_sats = metrics.get("daily_mined_sats", 0)
daily_profit_usd = metrics.get("daily_profit_usd", 0)
# Build message
message = f"Daily Mining Summary: {hashrate_24hr} {hashrate_unit} average hashrate, {daily_mined_sats} SATS mined (${daily_profit_usd:.2f})"
# Add notification
logging.info(f"[NotificationService] Generating daily stats notification: {message}")
return self.add_notification(
message,
level=NotificationLevel.INFO,
category=NotificationCategory.HASHRATE,
data={
"hashrate": hashrate_24hr,
"unit": hashrate_unit,
"daily_sats": daily_mined_sats,
"daily_profit": daily_profit_usd
}
)
except Exception as e:
logging.error(f"[NotificationService] Error generating daily stats notification: {e}")
return None
def _generate_block_notification(self, metrics: Dict[str, Any]) -> Optional[Dict[str, Any]]:
"""Generate notification for a new block found."""
try:
last_block_height = metrics.get("last_block_height", "Unknown")
last_block_earnings = metrics.get("last_block_earnings", "0")
logging.info(f"[NotificationService] Generating block notification: height={last_block_height}, earnings={last_block_earnings}")
message = f"New block found by the pool! Block #{last_block_height}, earnings: {last_block_earnings} SATS"
return self.add_notification(
message,
level=NotificationLevel.SUCCESS,
category=NotificationCategory.BLOCK,
data={
"block_height": last_block_height,
"earnings": last_block_earnings
}
)
except Exception as e:
logging.error(f"[NotificationService] Error generating block notification: {e}")
return None
def _parse_numeric_value(self, value_str: Any) -> float:
"""Parse numeric values from strings that may include units."""
if isinstance(value_str, (int, float)):
return float(value_str)
if isinstance(value_str, str):
# Extract just the numeric part
parts = value_str.split()
try:
return float(parts[0])
except (ValueError, IndexError):
pass
return 0.0
def _check_hashrate_change(self, current: Dict[str, Any], previous: Dict[str, Any]) -> Optional[Dict[str, Any]]:
"""Check for significant hashrate changes using 10-minute average."""
try:
# Get 10min hashrate values
current_10min = current.get("hashrate_10min", 0)
previous_10min = previous.get("hashrate_10min", 0)
# Log what we're comparing
logging.debug(f"[NotificationService] Comparing 10min hashrates - current: {current_10min}, previous: {previous_10min}")
# Skip if values are missing
if not current_10min or not previous_10min:
logging.debug("[NotificationService] Skipping hashrate check - missing values")
return None
# Parse values consistently
current_value = self._parse_numeric_value(current_10min)
previous_value = self._parse_numeric_value(previous_10min)
logging.debug(f"[NotificationService] Converted 10min hashrates - current: {current_value}, previous: {previous_value}")
# Skip if previous was zero (prevents division by zero)
if previous_value == 0:
logging.debug("[NotificationService] Skipping hashrate check - previous was zero")
return None
# Calculate percentage change
percent_change = ((current_value - previous_value) / previous_value) * 100
logging.debug(f"[NotificationService] 10min hashrate change: {percent_change:.1f}%")
# Significant decrease
if percent_change <= -SIGNIFICANT_HASHRATE_CHANGE_PERCENT:
message = f"Significant 10min hashrate drop detected: {abs(percent_change):.1f}% decrease"
logging.info(f"[NotificationService] Generating hashrate notification: {message}")
return self.add_notification(
message,
level=NotificationLevel.WARNING,
category=NotificationCategory.HASHRATE,
data={
"previous": previous_value,
"current": current_value,
"change": percent_change,
"timeframe": "10min"
}
)
# Significant increase
elif percent_change >= SIGNIFICANT_HASHRATE_CHANGE_PERCENT:
message = f"10min hashrate increase detected: {percent_change:.1f}% increase"
logging.info(f"[NotificationService] Generating hashrate notification: {message}")
return self.add_notification(
message,
level=NotificationLevel.SUCCESS,
category=NotificationCategory.HASHRATE,
data={
"previous": previous_value,
"current": current_value,
"change": percent_change,
"timeframe": "10min"
}
)
return None
except Exception as e:
logging.error(f"[NotificationService] Error checking hashrate change: {e}")
return None
def _check_earnings_progress(self, current: Dict[str, Any], previous: Dict[str, Any]) -> Optional[Dict[str, Any]]:
"""Check for significant earnings progress or payout approach."""
try:
current_unpaid = self._parse_numeric_value(current.get("unpaid_earnings", "0"))
# Check if approaching payout
if current.get("est_time_to_payout"):
est_time = current.get("est_time_to_payout")
# If estimated time is a number of days
if est_time.isdigit() or (est_time[0] == '-' and est_time[1:].isdigit()):
days = int(est_time)
if 0 < days <= 1:
if self._should_send_payout_notification():
message = "Payout approaching! Estimated within 1 day"
self.last_payout_notification_time = datetime.now()
return self.add_notification(
message,
level=NotificationLevel.SUCCESS,
category=NotificationCategory.EARNINGS,
data={"days_to_payout": days}
)
# If it says "next block"
elif "next block" in est_time.lower():
if self._should_send_payout_notification():
message = "Payout expected with next block!"
self.last_payout_notification_time = datetime.now()
return self.add_notification(
message,
level=NotificationLevel.SUCCESS,
category=NotificationCategory.EARNINGS,
data={"payout_imminent": True}
)
# Check for payout (unpaid balance reset)
if previous.get("unpaid_earnings"):
previous_unpaid = self._parse_numeric_value(previous.get("unpaid_earnings", "0"))
# If balance significantly decreased, likely a payout occurred
if previous_unpaid > 0 and current_unpaid < previous_unpaid * 0.5:
message = f"Payout received! Unpaid balance reset from {previous_unpaid} to {current_unpaid} BTC"
return self.add_notification(
message,
level=NotificationLevel.SUCCESS,
category=NotificationCategory.EARNINGS,
data={
"previous_balance": previous_unpaid,
"current_balance": current_unpaid,
"payout_amount": previous_unpaid - current_unpaid
}
)
return None
except Exception as e:
logging.error(f"[NotificationService] Error checking earnings progress: {e}")
return None
def _should_send_payout_notification(self) -> bool:
"""Check if enough time has passed since the last payout notification."""
if self.last_payout_notification_time is None:
return True
time_since_last_notification = datetime.now() - self.last_payout_notification_time
return time_since_last_notification.total_seconds() > ONE_DAY_SECONDS

project_structure.md (new file)
# Enhanced Project Structure Documentation
This document provides a comprehensive overview of the Bitcoin Mining Dashboard project architecture, component relationships, and technical design decisions.
## Directory Structure
```
DeepSea-Dashboard/
├── App.py # Main application entry point
├── config.py # Configuration management
├── config.json # Configuration file
├── data_service.py # Service for fetching mining data
├── models.py # Data models
├── state_manager.py # Manager for persistent state
├── worker_service.py # Service for worker data management
├── notification_service.py # Service for notifications
├── minify.py # Script for minifying assets
├── setup.py # Setup script for organizing files
├── requirements.txt # Python dependencies
├── Dockerfile # Docker configuration
├── docker-compose.yml # Docker Compose configuration
├── templates/ # HTML templates
│ ├── base.html # Base template with common elements
│ ├── boot.html # Boot sequence animation
│ ├── dashboard.html # Main dashboard template
│ ├── workers.html # Workers dashboard template
│ ├── blocks.html # Bitcoin blocks template
│ ├── notifications.html # Notifications template
│ └── error.html # Error page template
├── static/ # Static assets
│ ├── css/ # CSS files
│ │ ├── common.css # Shared styles across all pages
│ │ ├── dashboard.css # Main dashboard styles
│ │ ├── workers.css # Workers page styles
│ │ ├── boot.css # Boot sequence styles
│ │ ├── blocks.css # Blocks page styles
│ │ ├── notifications.css # Notifications page styles
│ │ ├── error.css # Error page styles
│ │ ├── retro-refresh.css # Floating refresh bar styles
│ │ └── theme-toggle.css # Theme toggle styles
│ │
│ └── js/ # JavaScript files
│ ├── main.js # Main dashboard functionality
│ ├── workers.js # Workers page functionality
│ ├── blocks.js # Blocks page functionality
│ ├── notifications.js # Notifications functionality
│ ├── block-animation.js # Block mining animation
│ ├── BitcoinProgressBar.js # System monitor functionality
│ └── theme.js # Theme toggle functionality
├── deployment_steps.md # Deployment guide
├── project_structure.md # Additional structure documentation
├── LICENSE.md # License information
└── logs/ # Application logs (generated at runtime)
```
## Core Components
### Backend Services
#### App.py
The main Flask application that serves as the entry point. It:
- Initializes the application and its components
- Configures routes and middleware
- Sets up the background scheduler for data updates
- Manages Server-Sent Events (SSE) connections
- Handles error recovery and graceful shutdown
Key features:
- Custom middleware for error handling
- Connection limiting for SSE to prevent resource exhaustion
- Watchdog process for scheduler health
- Metrics caching with controlled update frequency
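The connection-limiting idea can be sketched as a small thread-safe counter that SSE routes consult before streaming (a sketch only; the class name and limit are illustrative, not App.py's actual implementation):

```python
import threading

class SSEConnectionLimiter:
    """Caps the number of concurrent SSE clients (illustrative sketch)."""

    def __init__(self, max_connections=10):
        self.max_connections = max_connections
        self._active = 0
        self._lock = threading.Lock()

    def acquire(self):
        """Return True if a new client may connect, False if at capacity."""
        with self._lock:
            if self._active >= self.max_connections:
                return False
            self._active += 1
            return True

    def release(self):
        """Called when a client disconnects; never drops below zero."""
        with self._lock:
            self._active = max(0, self._active - 1)
```

A route handler would call `acquire()` before opening the stream and `release()` in a `finally` block when the client goes away.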
#### data_service.py
Service responsible for fetching data from external sources:
- Retrieves mining statistics from Ocean.xyz
- Collects Bitcoin network data (price, difficulty, hashrate)
- Calculates profitability metrics
- Handles connection issues and retries
Notable implementations:
- Concurrent API requests using ThreadPoolExecutor
- Multiple parsing strategies for resilience against HTML changes
- Intelligent caching to reduce API load
- Unit normalization for consistent display
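The concurrent-request pattern looks roughly like this (a sketch of the ThreadPoolExecutor approach; the function and key names are hypothetical, not data_service.py's API):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(fetchers):
    """Run independent fetch callables concurrently; collect results by name.

    A failing source yields None instead of taking down the whole update.
    """
    results = {}
    with ThreadPoolExecutor(max_workers=min(8, max(1, len(fetchers)))) as pool:
        futures = {name: pool.submit(fn) for name, fn in fetchers.items()}
        for name, future in futures.items():
            try:
                results[name] = future.result(timeout=10)
            except Exception:
                results[name] = None  # degrade gracefully on failure
    return results
```

Usage would be something like `fetch_all({"pool_stats": get_ocean_stats, "btc_price": get_price})`, with each value fetched in parallel.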
#### worker_service.py
Service for managing worker data:
- Fetches worker statistics from Ocean.xyz
- Simulates worker data when real data is unavailable
- Provides filtering and search capabilities
- Tracks worker status and performance
Key features:
- Fallback data generation for testing or connectivity issues
- Smart worker count synchronization
- Hashrate normalization across different units
#### state_manager.py
Manager for application state and history:
- Maintains hashrate history and metrics over time
- Provides persistence via Redis (optional)
- Implements data pruning to prevent memory growth
- Records indicator arrows for value changes
Implementation details:
- Thread-safe collections with locking
- Optimized storage format for Redis
- Data compression techniques for large state objects
- Automatic recovery of critical state
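The thread-safe, self-pruning history can be sketched with a lock plus a bounded deque, which drops the oldest points automatically (illustrative names; state_manager.py's actual API may differ):

```python
import threading
from collections import deque

class HistoryStore:
    """Bounded, thread-safe metric history (illustrative sketch)."""

    def __init__(self, max_points=180):
        # maxlen makes the deque discard the oldest entry on overflow,
        # which is the pruning behaviour described above.
        self._points = deque(maxlen=max_points)
        self._lock = threading.Lock()

    def append(self, point):
        with self._lock:
            self._points.append(point)

    def snapshot(self):
        """Return a copy so callers never iterate a mutating collection."""
        with self._lock:
            return list(self._points)
```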
### Frontend Components
#### Templates
The application uses Jinja2 templates with a retro-themed design:
- **base.html**: Defines the common layout, navigation, and includes shared assets
- **dashboard.html**: Main metrics display with hashrate chart and financial calculations
- **workers.html**: Grid layout of worker cards with filtering controls
- **blocks.html**: Bitcoin block explorer with detailed information
- **boot.html**: Animated terminal boot sequence
- **error.html**: Styled error page with technical information
#### JavaScript Modules
Client-side functionality is organized into modular JavaScript files:
- **main.js**: Dashboard functionality, real-time updates, and chart rendering
- **workers.js**: Worker grid rendering, filtering, and mini-chart creation
- **blocks.js**: Block explorer with data fetching from mempool.guide
- **block-animation.js**: Interactive block mining animation
- **BitcoinProgressBar.js**: Floating system monitor with uptime and connection status
Key client-side features:
- Real-time data updates via Server-Sent Events (SSE)
- Automatic reconnection with exponential backoff
- Cross-tab synchronization using localStorage
- Data normalization for consistent unit display
- Animated UI elements for status changes
## Architecture Overview
### Data Flow
1. **Data Acquisition**:
- `data_service.py` fetches data from Ocean.xyz and blockchain sources
- Data is normalized, converted, and enriched with calculated metrics
- Results are cached in memory
2. **State Management**:
- `state_manager.py` tracks historical data points
- Maintains arrow indicators for value changes
- Optionally persists state to Redis
3. **Background Updates**:
- Scheduler runs periodic updates (typically once per minute)
- Updates are throttled to prevent API overload
- Watchdog monitors scheduler health
4. **Real-time Distribution**:
- New data is pushed to clients via Server-Sent Events
- Clients process and render updates without page reloads
- Connection management prevents resource exhaustion
5. **Client Rendering**:
- Browser receives and processes JSON updates
- Chart.js visualizes hashrate trends
- DOM updates show changes with visual indicators
- BitcoinProgressBar shows system status
### System Resilience
The application implements multiple resilience mechanisms:
#### Server-Side Resilience
- **Scheduler Recovery**: Auto-detects and restarts failed schedulers
- **Memory Management**: Prunes old data to prevent memory growth
- **Connection Limiting**: Caps maximum concurrent SSE connections
- **Graceful Degradation**: Falls back to simpler data when sources are unavailable
- **Adaptive Parsing**: Multiple strategies to handle API and HTML changes
#### Client-Side Resilience
- **Connection Recovery**: Automatic reconnection with exponential backoff
- **Fallback Polling**: Switches to traditional AJAX if SSE fails
- **Local Storage Synchronization**: Shares data across browser tabs
- **Visibility Handling**: Optimizes updates based on page visibility
### Technical Design Decisions
#### Server-Sent Events vs WebSockets
The application uses SSE instead of WebSockets because:
- Data flow is primarily one-directional (server to client)
- SSE has better reconnection handling
- Simpler implementation without additional dependencies
- Better compatibility with proxy servers
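For reference, an SSE frame is plain text: an optional `event:` line, a `data:` line, and a terminating blank line. A minimal formatter (the helper name is illustrative):

```python
import json

def sse_event(data, event=None):
    """Serialize a payload as a Server-Sent Events frame."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    lines.append(f"data: {json.dumps(data)}")
    return "\n".join(lines) + "\n\n"  # blank line ends the frame
```

A streaming response simply yields such frames; the browser's `EventSource` parses them and handles reconnection on its own.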
#### Single Worker Model
The application uses a single Gunicorn worker with multiple threads because:
- Shared in-memory state is simpler than distributed state
- Reduces complexity of synchronization
- Most operations are I/O bound, making threads effective
- Typical deployments have moderate user counts
#### Optional Redis Integration
Redis usage is optional because:
- Small deployments don't require persistence
- Makes local development simpler
- Allows for flexible deployment options
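The optional integration can be expressed as a factory that falls back to an in-memory store when Redis is absent or unreachable (a sketch of the idea; state_manager.py's actual fallback logic may differ):

```python
def make_state_backend(redis_url=None):
    """Return a Redis client when configured and reachable, else a dict."""
    if redis_url:
        try:
            import redis  # optional dependency; may not be installed
            client = redis.Redis.from_url(redis_url)
            client.ping()  # verify the server is actually reachable
            return client
        except Exception:
            pass  # fall through to the in-memory stand-in
    return {}  # plain dict; persistence is lost on restart
```

Callers then code against a minimal get/set surface, so small deployments run with no Redis at all.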
#### Hashrate Normalization
All hashrates are normalized to TH/s internally because:
- Provides consistent basis for comparisons
- Simplifies trend calculations and charts
- Allows for unit conversion on display
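A minimal version of the normalization (the conversion factors are standard SI; the function name is illustrative):

```python
# Multiplier to convert each unit into TH/s.
UNIT_FACTORS = {
    "H/s": 1e-12, "KH/s": 1e-9, "MH/s": 1e-6,
    "GH/s": 1e-3, "TH/s": 1.0, "PH/s": 1e3, "EH/s": 1e6,
}

def normalize_to_ths(value, unit):
    """Convert a hashrate in any supported unit to TH/s."""
    return float(value) * UNIT_FACTORS[unit.strip()]
```

With everything internally in TH/s, charts and comparisons need no per-unit special cases; the display layer converts back to a human-friendly unit.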
## Component Interactions
```
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Ocean.xyz API │ │ blockchain.info │ │ mempool.guide │
└────────┬────────┘ └────────┬────────┘ └────────┬────────┘
│ │ │
▼ ▼ ▼
┌────────────────────────────────────────────────────────────────────┐
│ data_service.py │
└────────────────────────────────┬───────────────────────────────────┘
┌────────────────────────────────────────────────────────────────────┐
│ App.py │
├────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ │
│ │ worker_service │ │ state_manager │ │ Background Jobs │ │
│ └─────────────────┘ └─────────────────┘ └─────────────────┘ │
│ │
└───────────────────────────────┬────────────────────────────────────┘
┌────────────────────────────────────────────────────────────────────┐
│ Flask Routes & SSE │
└───────────────────────────────┬────────────────────────────────────┘
┌────────────────────────────────────────────┐
│ │
▼ ▼
┌─────────────────┐ ┌─────────────────┐
│ Browser Tab 1 │ │ Browser Tab N │
└─────────────────┘ └─────────────────┘
```
## Performance Considerations
### Memory Usage
- Arrow history is pruned to prevent unbounded growth
- Older data points are stored at reduced resolution
- Regular garbage collection cycles are scheduled
- Memory usage is logged for monitoring
### Network Optimization
- Data is cached to reduce API calls
- Updates are throttled to reasonable frequencies
- SSE connections have a maximum lifetime
- Failed connections use exponential backoff
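The reconnect backoff follows the usual capped-exponential-with-jitter shape (the base and cap values here are illustrative, not the dashboard's exact constants):

```python
import random

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Delay in seconds before reconnect attempt `attempt` (0-indexed).

    Doubles each attempt, caps at `cap`, and applies 50-100% jitter so
    many clients do not reconnect in lockstep.
    """
    delay = min(cap, base * (2 ** attempt))
    return delay * (0.5 + random.random() / 2)
```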
### Browser Performance
- Charts use optimized rendering with limited animation
- DOM updates are batched where possible
- Data is processed in small chunks
- CSS transitions are used for smooth animations
## Future Enhancement Areas
1. **Database Integration**: Option for SQL database for long-term metrics storage
2. **User Authentication**: Multi-user support with separate configurations
3. **Mining Pool Expansion**: Support for additional mining pools beyond Ocean.xyz
4. **Mobile App**: Dedicated mobile application with push notifications
5. **Advanced Analytics**: Profitability projections and historical analysis

requirements.txt (modified)
gunicorn==22.0.0
htmlmin==0.1.12
redis==5.0.1
APScheduler==3.10.4
psutil==5.9.5
Werkzeug==2.3.7
Jinja2==3.1.2
itsdangerous==2.1.2
MarkupSafe==2.1.3
soupsieve==2.5
tzdata==2023.3
pytz==2023.3
tzlocal==5.0.1
urllib3==2.0.7
idna==3.4
certifi==2023.7.22
six==1.16.0
jsmin==3.0.1

setup.py (new file)
#!/usr/bin/env python3
"""
Enhanced setup script for Bitcoin Mining Dashboard.
This script prepares the project structure, installs dependencies,
verifies configuration, and provides system checks for optimal operation.
"""
import os
import sys
import shutil
import logging
import argparse
import subprocess
import json
import re
from pathlib import Path
# Configure logging with color support
try:
import colorlog
handler = colorlog.StreamHandler()
handler.setFormatter(colorlog.ColoredFormatter(
'%(log_color)s%(asctime)s - %(levelname)s - %(message)s',
log_colors={
'DEBUG': 'cyan',
'INFO': 'green',
'WARNING': 'yellow',
'ERROR': 'red',
'CRITICAL': 'red,bg_white',
}
))
logger = colorlog.getLogger()
logger.addHandler(handler)
logger.setLevel(logging.INFO)
except ImportError:
# Fallback to standard logging if colorlog is not available
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger()
# Directory structure to create
DIRECTORIES = [
'static/css',
'static/js',
'static/js/min', # For minified JS files
'templates',
'logs',
'data' # For temporary data storage
]
# Files to move to their correct locations
FILE_MAPPINGS = {
# CSS files
'common.css': 'static/css/common.css',
'dashboard.css': 'static/css/dashboard.css',
'workers.css': 'static/css/workers.css',
'boot.css': 'static/css/boot.css',
'error.css': 'static/css/error.css',
'retro-refresh.css': 'static/css/retro-refresh.css',
'blocks.css': 'static/css/blocks.css',
'notifications.css': 'static/css/notifications.css',
'theme-toggle.css': 'static/css/theme-toggle.css', # Added theme-toggle.css
# JS files
'main.js': 'static/js/main.js',
'workers.js': 'static/js/workers.js',
'blocks.js': 'static/js/blocks.js',
'BitcoinProgressBar.js': 'static/js/BitcoinProgressBar.js',
'notifications.js': 'static/js/notifications.js',
'theme.js': 'static/js/theme.js', # Added theme.js
# Template files
'base.html': 'templates/base.html',
'dashboard.html': 'templates/dashboard.html',
'workers.html': 'templates/workers.html',
'boot.html': 'templates/boot.html',
'error.html': 'templates/error.html',
'blocks.html': 'templates/blocks.html',
'notifications.html': 'templates/notifications.html',
}
# Default configuration
DEFAULT_CONFIG = {
"power_cost": 0.0,
"power_usage": 0.0,
"wallet": "yourwallethere",
"timezone": "America/Los_Angeles", # Added default timezone
"network_fee": 0.0 # Added default network fee
}
def parse_arguments():
"""Parse command line arguments."""
parser = argparse.ArgumentParser(description='Setup the Bitcoin Mining Dashboard')
parser.add_argument('--debug', action='store_true', help='Enable debug logging')
parser.add_argument('--wallet', type=str, help='Set your Ocean.xyz wallet address')
parser.add_argument('--power-cost', type=float, help='Set your electricity cost per kWh')
parser.add_argument('--power-usage', type=float, help='Set your power consumption in watts')
parser.add_argument('--network-fee', type=float, help='Set your network fee percentage') # Added network fee parameter
parser.add_argument('--timezone', type=str, help='Set your timezone (e.g., America/Los_Angeles)') # Added timezone parameter
parser.add_argument('--skip-checks', action='store_true', help='Skip dependency checks')
parser.add_argument('--force', action='store_true', help='Force file overwrite')
parser.add_argument('--config', type=str, help='Path to custom config.json')
parser.add_argument('--minify', action='store_true', help='Minify JavaScript files')
parser.add_argument('--theme', choices=['bitcoin', 'deepsea'], help='Set the default UI theme') # Added theme parameter
return parser.parse_args()
def create_directory_structure():
"""Create the necessary directory structure."""
logger.info("Creating directory structure...")
success = True
for directory in DIRECTORIES:
try:
os.makedirs(directory, exist_ok=True)
logger.debug(f"Created directory: {directory}")
except Exception as e:
logger.error(f"Failed to create directory {directory}: {str(e)}")
success = False
if success:
logger.info("✓ Directory structure created successfully")
else:
logger.warning("⚠ Some directories could not be created")
return success
def move_files(force=False):
"""
Move files to their correct locations.
Args:
force (bool): Force overwriting of existing files
"""
logger.info("Moving files to their correct locations...")
success = True
moved_count = 0
skipped_count = 0
missing_count = 0
for source, destination in FILE_MAPPINGS.items():
if os.path.exists(source):
# Create the directory if it doesn't exist
os.makedirs(os.path.dirname(destination), exist_ok=True)
# Check if destination exists and handle according to force flag
if os.path.exists(destination) and not force:
logger.debug(f"Skipped {source} (destination already exists)")
skipped_count += 1
continue
try:
# Copy the file to its destination
shutil.copy2(source, destination)
logger.debug(f"Moved {source} to {destination}")
moved_count += 1
except Exception as e:
logger.error(f"Failed to copy {source} to {destination}: {str(e)}")
success = False
else:
logger.warning(f"Source file not found: {source}")
missing_count += 1
if success:
logger.info(f"✓ File movement completed: {moved_count} moved, {skipped_count} skipped, {missing_count} missing")
else:
logger.warning("⚠ Some files could not be moved")
return success
def minify_js_files():
"""Minify JavaScript files."""
logger.info("Minifying JavaScript files...")
try:
import jsmin
except ImportError:
logger.error("jsmin package not found. Installing...")
try:
subprocess.run([sys.executable, "-m", "pip", "install", "jsmin"], check=True)
import jsmin
logger.info("✓ jsmin package installed successfully")
except Exception as e:
logger.error(f"Failed to install jsmin: {str(e)}")
logger.error("Please run: pip install jsmin")
return False
js_dir = 'static/js'
min_dir = os.path.join(js_dir, 'min')
os.makedirs(min_dir, exist_ok=True)
minified_count = 0
for js_file in os.listdir(js_dir):
if js_file.endswith('.js') and not js_file.endswith('.min.js'):
input_path = os.path.join(js_dir, js_file)
output_path = os.path.join(min_dir, js_file.replace('.js', '.min.js'))
try:
with open(input_path, 'r') as f:
js_content = f.read()
# Minify the content
minified = jsmin.jsmin(js_content)
# Write minified content
with open(output_path, 'w') as f:
f.write(minified)
minified_count += 1
logger.debug(f"Minified {js_file}")
except Exception as e:
logger.error(f"Failed to minify {js_file}: {str(e)}")
logger.info(f"✓ JavaScript minification completed: {minified_count} files processed")
return True
def validate_wallet_address(wallet):
"""
Validate Bitcoin wallet address format.
Args:
wallet (str): Bitcoin wallet address
Returns:
bool: True if valid, False otherwise
"""
# Basic validation patterns for different Bitcoin address formats
patterns = [
r'^1[a-km-zA-HJ-NP-Z1-9]{25,34}$', # Legacy
r'^3[a-km-zA-HJ-NP-Z1-9]{25,34}$', # P2SH
r'^bc1[a-zA-Z0-9]{39,59}$', # Bech32
r'^bc1p[a-zA-Z0-9]{39,59}$', # Taproot
r'^bc1p[a-z0-9]{73,107}$' # Longform Taproot
]
# Check if the wallet matches any of the patterns
for pattern in patterns:
if re.match(pattern, wallet):
return True
return False
def create_config(args):
"""
Create or update config.json file.
Args:
args: Command line arguments
"""
config_file = args.config if args.config else 'config.json'
config = DEFAULT_CONFIG.copy()
# Load existing config if available
if os.path.exists(config_file):
try:
with open(config_file, 'r') as f:
existing_config = json.load(f)
config.update(existing_config)
logger.info(f"Loaded existing configuration from {config_file}")
except json.JSONDecodeError:
logger.warning(f"Invalid JSON in {config_file}, using default configuration")
except Exception as e:
logger.error(f"Error reading {config_file}: {str(e)}")
# Update config from command line arguments
if args.wallet:
if validate_wallet_address(args.wallet):
config["wallet"] = args.wallet
else:
logger.warning(f"Invalid wallet address format: {args.wallet}")
logger.warning("Using default or existing wallet address")
if args.power_cost is not None:
if args.power_cost >= 0:
config["power_cost"] = args.power_cost
else:
logger.warning("Power cost cannot be negative, using default or existing value")
if args.power_usage is not None:
if args.power_usage >= 0:
config["power_usage"] = args.power_usage
else:
logger.warning("Power usage cannot be negative, using default or existing value")
# Additional options: timezone, network fee, theme
if args.timezone:
config["timezone"] = args.timezone
if args.network_fee is not None:
if args.network_fee >= 0:
config["network_fee"] = args.network_fee
else:
logger.warning("Network fee cannot be negative, using default or existing value")
if args.theme:
config["theme"] = args.theme
# Save the configuration
try:
with open(config_file, 'w') as f:
json.dump(config, f, indent=2, sort_keys=True)
logger.info(f"✓ Configuration saved to {config_file}")
except Exception as e:
logger.error(f"Failed to save configuration: {str(e)}")
return False
# Print current configuration
logger.info("Current configuration:")
logger.info(f" ├── Wallet address: {config['wallet']}")
logger.info(f" ├── Power cost: ${config['power_cost']} per kWh")
logger.info(f" ├── Power usage: {config['power_usage']} watts")
logger.info(f" ├── Network fee: {config['network_fee']}%")
logger.info(f" └── Timezone: {config['timezone']}")
return True
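The precedence above — built-in defaults, then any existing config.json, then command-line flags — can be sketched as a pure merge. The keys and default values here are illustrative, not the project's actual defaults:

```python
# Minimal sketch of the merge precedence used by create_config();
# DEFAULT_CONFIG contents here are hypothetical.
DEFAULT_CONFIG = {
    "wallet": "default-wallet",
    "power_cost": 0.0,
    "power_usage": 0.0,
    "network_fee": 0.0,
    "timezone": "UTC",
}

def merge_config(defaults, existing, cli_overrides):
    """Later sources win: built-in defaults < config.json < command-line flags."""
    config = defaults.copy()
    config.update(existing)
    # Only apply CLI values that were actually provided (argparse leaves them None)
    config.update({k: v for k, v in cli_overrides.items() if v is not None})
    return config

merged = merge_config(
    DEFAULT_CONFIG,
    {"wallet": "bc1-example", "power_cost": 0.10},  # pretend config.json contents
    {"power_cost": 0.12, "timezone": None},         # pretend parsed CLI flags
)
```

A flag that was not passed (here `timezone`) falls through to the file value or the default, while an explicit flag (`power_cost`) wins over both.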
def check_dependencies(skip=False):
"""
Check if required Python dependencies are installed.
Args:
skip (bool): Skip the dependency check
"""
if skip:
logger.info("Skipping dependency check")
return True
logger.info("Checking dependencies...")
try:
# Check if pip is available
subprocess.run([sys.executable, "-m", "pip", "--version"],
check=True, capture_output=True, text=True)
except Exception as e:
logger.error(f"Pip is not available: {str(e)}")
logger.error("Please install pip before continuing")
return False
# Check if requirements.txt exists
if not os.path.exists('requirements.txt'):
logger.error("requirements.txt not found")
return False
# Check currently installed packages
try:
result = subprocess.run(
[sys.executable, "-m", "pip", "freeze"],
check=True, capture_output=True, text=True
)
installed_output = result.stdout
installed_packages = {
line.split('==')[0].lower(): line.split('==')[1] if '==' in line else ''
for line in installed_output.splitlines()
}
except Exception as e:
logger.error(f"Failed to check installed packages: {str(e)}")
installed_packages = {}
# Read requirements
try:
with open('requirements.txt', 'r') as f:
requirements = f.read().splitlines()
except Exception as e:
logger.error(f"Failed to read requirements.txt: {str(e)}")
return False
# Check each requirement
missing_packages = []
for req in requirements:
if req and not req.startswith('#'):
package = req.split('==')[0].lower()
if package not in installed_packages:
missing_packages.append(req)
if missing_packages:
logger.warning(f"Missing {len(missing_packages)} required packages")
logger.info("Installing missing packages...")
try:
subprocess.run(
[sys.executable, "-m", "pip", "install", "-r", "requirements.txt"],
check=True, capture_output=True, text=True
)
logger.info("✓ Dependencies installed successfully")
except Exception as e:
logger.error(f"Failed to install dependencies: {str(e)}")
logger.error("Please run: pip install -r requirements.txt")
return False
else:
logger.info("✓ All required packages are installed")
return True
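Note that the comparison above only matches package names pinned with `==`; version specifiers like `>=` or environment markers are not interpreted. A standalone sketch of that matching logic (the sample requirement lines are hypothetical):

```python
def missing_requirements(requirement_lines, installed):
    """Return requirement lines whose package name is not in `installed`.

    `installed` mirrors the dict built from `pip freeze` above:
    lowercase package name -> version string.
    """
    missing = []
    for req in requirement_lines:
        req = req.strip()
        # Skip blank lines and comments, as check_dependencies() does
        if req and not req.startswith('#'):
            package = req.split('==')[0].lower()
            if package not in installed:
                missing.append(req)
    return missing

missing = missing_requirements(
    ["flask==2.3.2", "# a comment", "", "redis==5.0.1"],
    {"flask": "2.3.2"},
)
```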
def check_redis():
"""Check if Redis is available."""
logger.info("Checking Redis availability...")
redis_url = os.environ.get("REDIS_URL")
if not redis_url:
logger.info("⚠ Redis URL not configured (REDIS_URL environment variable not set)")
logger.info(" └── The dashboard will run without persistent state")
logger.info(" └── Set REDIS_URL for better reliability")
return True
try:
import redis
client = redis.Redis.from_url(redis_url)
client.ping()
logger.info(f"✓ Successfully connected to Redis at {redis_url}")
return True
except ImportError:
logger.warning("Redis Python package not installed")
logger.info(" └── Run: pip install redis")
return False
except Exception as e:
logger.warning(f"Failed to connect to Redis: {str(e)}")
logger.info(f" └── Check that Redis is running and accessible at {redis_url}")
return False
def perform_system_checks():
"""Perform system checks and provide recommendations."""
logger.info("Performing system checks...")
# Check Python version
python_version = f"{sys.version_info.major}.{sys.version_info.minor}.{sys.version_info.micro}"
if sys.version_info.major < 3 or (sys.version_info.major == 3 and sys.version_info.minor < 9):
logger.warning(f"⚠ Python version {python_version} is below recommended (3.9+)")
else:
logger.info(f"✓ Python version {python_version} is compatible")
# Check available memory
try:
import psutil
memory = psutil.virtual_memory()
memory_gb = memory.total / (1024**3)
if memory_gb < 1:
logger.warning(f"⚠ Low system memory: {memory_gb:.2f} GB (recommended: 1+ GB)")
else:
logger.info(f"✓ System memory: {memory_gb:.2f} GB")
except ImportError:
logger.debug("psutil not available, skipping memory check")
# Check write permissions
log_dir = 'logs'
try:
os.makedirs(log_dir, exist_ok=True)
test_file = os.path.join(log_dir, 'test_write.tmp')
with open(test_file, 'w') as f:
f.write('test')
os.remove(test_file)
logger.info("✓ Write permissions for logs directory")
except Exception as e:
logger.warning(f"⚠ Cannot write to logs directory: {str(e)}")
# Check port availability
port = 5000 # Default port
try:
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', port))
s.close()
logger.info(f"✓ Port {port} is available")
except Exception:
logger.warning(f"⚠ Port {port} is already in use")
logger.info("System checks completed")
def main():
"""Main setup function."""
args = parse_arguments()
# Set logging level
if args.debug:
logger.setLevel(logging.DEBUG)
logger.debug("Debug logging enabled")
logger.info("=== Bitcoin Mining Dashboard Setup ===")
# Check dependencies
if not check_dependencies(args.skip_checks):
logger.error("Dependency check failed. Please install required packages and retry.")
return 1
# Create directory structure
if not create_directory_structure():
logger.error("Failed to create directory structure.")
return 1
# Move files to their correct locations
if not move_files(args.force):
logger.warning("Some files could not be moved, but continuing...")
# Create or update configuration
if not create_config(args):
logger.error("Failed to create configuration file.")
return 1
# Minify JavaScript files if requested
if args.minify:
if not minify_js_files():
logger.warning("JavaScript minification failed, but continuing...")
# Check Redis if available
check_redis()
# Perform system checks
if not args.skip_checks:
perform_system_checks()
logger.info("=== Setup completed successfully ===")
logger.info("")
logger.info("Next steps:")
logger.info("1. Verify configuration in config.json")
logger.info("2. Start the application with: python App.py")
logger.info("3. Access the dashboard at: http://localhost:5000")
return 0
if __name__ == "__main__":
exit_code = main()
sys.exit(exit_code)

state_manager.py (new file, 469 lines)
"""
State manager module for handling persistent state and history.
"""
import logging
import json
import time
import gc
import threading
import redis
from config import get_timezone
# Global variables for arrow history, legacy hashrate history, and a log of full metrics snapshots.
arrow_history = {} # stored per second
hashrate_history = []
metrics_log = []
# Limits for data collections to prevent memory growth
MAX_HISTORY_ENTRIES = 180 # 3 hours worth at 1 min intervals
# Lock for thread safety
state_lock = threading.Lock()
class StateManager:
"""Manager for persistent state and history data."""
def __init__(self, redis_url=None):
"""
Initialize the state manager.
Args:
redis_url (str, optional): Redis URL for persistent storage
"""
self.redis_client = self._connect_to_redis(redis_url) if redis_url else None
self.STATE_KEY = "graph_state"
self.last_save_time = 0
# Load state if available
self.load_graph_state()
def _connect_to_redis(self, redis_url):
"""
Connect to Redis with retry logic.
Args:
redis_url (str): Redis URL
Returns:
redis.Redis: Redis client or None if connection failed
"""
if not redis_url:
logging.info("Redis URL not configured, using in-memory state only.")
return None
retry_count = 0
max_retries = 3
while retry_count < max_retries:
try:
client = redis.Redis.from_url(redis_url)
client.ping() # Test the connection
logging.info(f"Connected to Redis at {redis_url}")
return client
except Exception as e:
retry_count += 1
if retry_count < max_retries:
logging.warning(f"Redis connection attempt {retry_count} failed: {e}. Retrying...")
time.sleep(1) # Wait before retrying
else:
logging.error(f"Could not connect to Redis after {max_retries} attempts: {e}")
return None
def load_graph_state(self):
"""Load graph state from Redis with support for the optimized format."""
global arrow_history, hashrate_history, metrics_log
if not self.redis_client:
logging.info("Redis not available, using in-memory state.")
return
try:
# Check version to handle format changes
version = self.redis_client.get(f"{self.STATE_KEY}_version")
version = version.decode('utf-8') if version else "1.0"
state_json = self.redis_client.get(self.STATE_KEY)
if state_json:
state = json.loads(state_json)
# Handle different versions of the data format
if version == "2.0": # Optimized format
# Restore arrow_history
compact_arrow_history = state.get("arrow_history", {})
for key, values in compact_arrow_history.items():
arrow_history[key] = [
{"time": entry.get("t", ""),
"value": entry.get("v", 0),
"arrow": entry.get("a", "")} # Use saved arrow value
for entry in values
]
# Restore hashrate_history
hashrate_history = state.get("hashrate_history", [])
# Restore metrics_log
compact_metrics_log = state.get("metrics_log", [])
metrics_log = []
for entry in compact_metrics_log:
metrics_log.append({
"timestamp": entry.get("ts", ""),
"metrics": entry.get("m", {})
})
else: # Original format
arrow_history = state.get("arrow_history", {})
hashrate_history = state.get("hashrate_history", [])
metrics_log = state.get("metrics_log", [])
logging.info(f"Loaded graph state from Redis (format version {version}).")
else:
logging.info("No previous graph state found in Redis.")
except Exception as e:
logging.error(f"Error loading graph state from Redis: {e}")
def save_graph_state(self):
"""Save graph state to Redis with optimized frequency, pruning, and data reduction."""
if not self.redis_client:
logging.info("Redis not available, skipping state save.")
return
# Check if we've saved recently to avoid too frequent saves
# Only save at most once every 5 minutes
current_time = time.time()
if current_time - self.last_save_time < 300: # 300 seconds = 5 minutes
logging.debug("Skipping Redis save - last save was less than 5 minutes ago")
return
# Update the last save time
self.last_save_time = current_time
# Prune data first to reduce volume
self.prune_old_data()
# Create compact versions of the data structures for Redis storage
try:
# 1. Create compact arrow_history with minimal data
compact_arrow_history = {}
for key, values in arrow_history.items():
if isinstance(values, list) and values:
# Only store recent history (last 180 entries)
recent_values = values[-180:] if len(values) > 180 else values
# Use shorter field names and preserve arrow directions
compact_arrow_history[key] = [
{"t": entry["time"], "v": entry["value"], "a": entry["arrow"]}
for entry in recent_values
]
# 2. Only keep essential hashrate_history
compact_hashrate_history = hashrate_history[-60:] if len(hashrate_history) > 60 else hashrate_history
# 3. Only keep recent metrics_log entries (last 30 minutes)
# This is typically the largest data structure
compact_metrics_log = []
if metrics_log:
# Keep only last 30 entries (30 minutes assuming 1-minute updates)
recent_logs = metrics_log[-30:]
for entry in recent_logs:
# Only keep necessary fields from each metrics entry
if "metrics" in entry and "timestamp" in entry:
metrics_copy = {}
original_metrics = entry["metrics"]
# Only copy the most important metrics for historical tracking
essential_keys = [
"hashrate_60sec", "hashrate_24hr", "btc_price",
"workers_hashing", "unpaid_earnings", "difficulty",
"network_hashrate", "daily_profit_usd"
]
for key in essential_keys:
if key in original_metrics:
metrics_copy[key] = original_metrics[key]
# Skip arrow_history within metrics as we already stored it separately
compact_metrics_log.append({
"ts": entry["timestamp"],
"m": metrics_copy
})
# Create the final state object
state = {
"arrow_history": compact_arrow_history,
"hashrate_history": compact_hashrate_history,
"metrics_log": compact_metrics_log
}
# Convert to JSON once to reuse and measure size
state_json = json.dumps(state)
data_size_kb = len(state_json) / 1024
# Log data size for monitoring
logging.info(f"Saving graph state to Redis: {data_size_kb:.2f} KB (optimized format)")
# Only save if data size is reasonable (adjust threshold as needed)
if data_size_kb > 2000: # 2MB warning threshold (reduced from 5MB)
logging.warning(f"Redis save data size is still large: {data_size_kb:.2f} KB")
# Store version info to handle future format changes
self.redis_client.set(f"{self.STATE_KEY}_version", "2.0")
self.redis_client.set(self.STATE_KEY, state_json)
logging.info(f"Successfully saved graph state to Redis ({data_size_kb:.2f} KB)")
except Exception as e:
logging.error(f"Error saving graph state to Redis: {e}")
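The version-2.0 layout shortens field names (`time`/`value`/`arrow` become `t`/`v`/`a`) to cut Redis payload size. A minimal round-trip sketch of that mapping:

```python
def to_compact(entries):
    """Full history entries -> compact Redis form with shortened field names."""
    return [{"t": e["time"], "v": e["value"], "a": e["arrow"]} for e in entries]

def from_compact(entries):
    """Compact Redis form -> full entries, with the same defaults
    that load_graph_state() applies for missing fields."""
    return [
        {"time": e.get("t", ""), "value": e.get("v", 0), "arrow": e.get("a", "")}
        for e in entries
    ]

full = [{"time": "12:00:01", "value": 450.5, "arrow": "↑"}]
restored = from_compact(to_compact(full))
```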
def prune_old_data(self):
"""Remove old data to prevent memory growth with optimized strategy."""
global arrow_history, metrics_log
with state_lock:
# Prune arrow_history with more sophisticated approach
for key in arrow_history:
if isinstance(arrow_history[key], list):
if len(arrow_history[key]) > MAX_HISTORY_ENTRIES:
original_length = len(arrow_history[key])
# For most recent data (last hour) - keep every point
recent_data = arrow_history[key][-60:]
# For older data, reduce resolution by keeping every other point
older_data = arrow_history[key][:-60]
if len(older_data) > 0:
arrow_history[key] = older_data[::2] + recent_data
else:
arrow_history[key] = recent_data
logging.info(f"Pruned {key} history from {original_length} to {len(arrow_history[key])} entries")
# Prune metrics_log more aggressively
if len(metrics_log) > MAX_HISTORY_ENTRIES:
original_length = len(metrics_log)
# Keep most recent entries at full resolution
recent_logs = metrics_log[-60:]
# Reduce resolution of older entries by keeping every 3rd entry
older_logs = metrics_log[:-60]
if len(older_logs) > 0:
metrics_log = older_logs[::3] + recent_logs
logging.info(f"Pruned metrics log from {original_length} to {len(metrics_log)} entries")
# Free memory more aggressively
gc.collect()
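The thinning strategy — full resolution for the newest points, every other point for older ones — can be sketched in isolation (function and parameter names here are illustrative):

```python
def prune_series(history, recent_keep=60, stride=2):
    """Keep the newest `recent_keep` points at full resolution and
    thin everything older by keeping every `stride`-th point."""
    if len(history) <= recent_keep:
        return history
    older, recent = history[:-recent_keep], history[-recent_keep:]
    return older[::stride] + recent

series = list(range(200))      # 200 chronological samples
pruned = prune_series(series)  # 140 older samples halved, plus 60 recent kept whole
```

Recent data stays exact while older data degrades gracefully, which suits a rolling dashboard graph better than a hard truncation.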
def persist_critical_state(self, cached_metrics, scheduler_last_successful_run, last_metrics_update_time):
"""
Store critical state in Redis for recovery after worker restarts.
Args:
cached_metrics (dict): Current metrics
scheduler_last_successful_run (float): Timestamp of last successful scheduler run
last_metrics_update_time (float): Timestamp of last metrics update
"""
if not self.redis_client:
return
try:
# Only persist if we have valid data
if cached_metrics and cached_metrics.get("server_timestamp"):
state = {
"cached_metrics_timestamp": cached_metrics.get("server_timestamp"),
"last_successful_run": scheduler_last_successful_run,
"last_update_time": last_metrics_update_time
}
self.redis_client.set("critical_state", json.dumps(state))
logging.info(f"Persisted critical state to Redis, timestamp: {cached_metrics.get('server_timestamp')}")
except Exception as e:
logging.error(f"Error persisting critical state: {e}")
def load_critical_state(self):
"""
Recover critical state variables after a worker restart.
Returns:
tuple: (last_successful_run, last_update_time)
"""
if not self.redis_client:
return None, None
try:
state_json = self.redis_client.get("critical_state")
if state_json:
state = json.loads(state_json.decode('utf-8'))
last_successful_run = state.get("last_successful_run")
last_update_time = state.get("last_update_time")
logging.info(f"Loaded critical state from Redis, last run: {last_successful_run}")
# We don't restore cached_metrics itself, as we'll fetch fresh data
# Just note that we have state to recover from
logging.info(f"Last metrics timestamp from Redis: {state.get('cached_metrics_timestamp')}")
return last_successful_run, last_update_time
except Exception as e:
logging.error(f"Error loading critical state: {e}")
return None, None
def update_metrics_history(self, metrics):
"""
Update history collections with new metrics data.
Args:
metrics (dict): New metrics data
"""
global arrow_history, hashrate_history, metrics_log
# Skip if metrics is None
if not metrics:
return
arrow_keys = [
"pool_total_hashrate", "hashrate_24hr", "hashrate_3hr", "hashrate_10min",
"hashrate_60sec", "block_number", "btc_price", "network_hashrate",
"difficulty", "daily_revenue", "daily_power_cost", "daily_profit_usd",
"monthly_profit_usd", "daily_mined_sats", "monthly_mined_sats", "unpaid_earnings",
"estimated_earnings_per_day_sats", "estimated_earnings_next_block_sats", "estimated_rewards_in_window_sats",
"workers_hashing"
]
# --- Bucket by second in the configured timezone, with thread safety ---
from datetime import datetime
from zoneinfo import ZoneInfo
current_second = datetime.now(ZoneInfo(get_timezone())).strftime("%H:%M:%S")
with state_lock:
for key in arrow_keys:
if metrics.get(key) is not None:
current_val = metrics[key]
arrow = ""
# Get the corresponding unit key if available
unit_key = f"{key}_unit"
current_unit = metrics.get(unit_key, "")
if key in arrow_history and arrow_history[key]:
try:
previous_val = arrow_history[key][-1]["value"]
previous_unit = arrow_history[key][-1].get("unit", "")
previous_arrow = arrow_history[key][-1].get("arrow", "") # Get previous arrow
# Use the convert_to_ths function to normalize both values before comparison
if key.startswith("hashrate") and current_unit:
from models import convert_to_ths
norm_curr_val = convert_to_ths(float(current_val), current_unit)
norm_prev_val = convert_to_ths(float(previous_val), previous_unit if previous_unit else "th/s")
# Compare with a 0.01% threshold for sensitivity to small changes
if norm_curr_val > norm_prev_val * 1.0001:
arrow = "↑"
elif norm_curr_val < norm_prev_val * 0.9999:
arrow = "↓"
else:
arrow = previous_arrow # Preserve previous arrow if change is insignificant
else:
# For non-hashrate values or when units are missing
# Try to convert to float for comparison
try:
curr_float = float(current_val)
prev_float = float(previous_val)
# Compare with a 0.01% threshold for sensitivity to small changes
if curr_float > prev_float * 1.0001:
arrow = "↑"
elif curr_float < prev_float * 0.9999:
arrow = "↓"
else:
arrow = previous_arrow # Preserve previous arrow
except (ValueError, TypeError):
# If values can't be converted to float, compare directly
if current_val != previous_val:
arrow = "↑" if current_val > previous_val else "↓"
else:
arrow = previous_arrow # Preserve previous arrow
except Exception as e:
logging.error(f"Error calculating arrow for {key}: {e}")
# Keep previous arrow on error instead of empty string
if arrow_history[key] and arrow_history[key][-1].get("arrow"):
arrow = arrow_history[key][-1]["arrow"]
if key not in arrow_history:
arrow_history[key] = []
if not arrow_history[key] or arrow_history[key][-1]["time"] != current_second:
# Create new entry
entry = {
"time": current_second,
"value": current_val,
"arrow": arrow,
}
# Add unit information if available
if current_unit:
entry["unit"] = current_unit
arrow_history[key].append(entry)
else:
# Update existing entry
arrow_history[key][-1]["value"] = current_val
# Only update arrow if it's not empty - this preserves arrows between changes
if arrow:
arrow_history[key][-1]["arrow"] = arrow
# Update unit if available
if current_unit:
arrow_history[key][-1]["unit"] = current_unit
# Cap history to three hours worth (180 entries)
if len(arrow_history[key]) > MAX_HISTORY_ENTRIES:
arrow_history[key] = arrow_history[key][-MAX_HISTORY_ENTRIES:]
# --- Aggregate arrow_history by minute for the graph ---
aggregated_history = {}
for key, entries in arrow_history.items():
minute_groups = {}
for entry in entries:
minute = entry["time"][:5] # extract HH:MM
minute_groups[minute] = entry # take last entry for that minute
# Sort by time to ensure chronological order
aggregated_history[key] = sorted(list(minute_groups.values()),
key=lambda x: x["time"])
# Only keep the most recent 60 data points for the graph display
aggregated_history[key] = aggregated_history[key][-60:] if len(aggregated_history[key]) > 60 else aggregated_history[key]
metrics["arrow_history"] = aggregated_history
metrics["history"] = hashrate_history
entry = {"timestamp": datetime.now().isoformat(), "metrics": metrics}
metrics_log.append(entry)
# Cap the metrics log to three hours worth (180 entries)
if len(metrics_log) > MAX_HISTORY_ENTRIES:
metrics_log = metrics_log[-MAX_HISTORY_ENTRIES:]
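The arrow logic above compares against ±0.01% multipliers (1.0001 / 0.9999) and keeps the previous arrow for insignificant moves. A standalone sketch, assuming up/down glyphs for the arrow values:

```python
def direction_arrow(current, previous, previous_arrow=""):
    """Return an up/down arrow for a move larger than 0.01%,
    otherwise preserve the previous arrow."""
    if current > previous * 1.0001:
        return "↑"
    if current < previous * 0.9999:
        return "↓"
    return previous_arrow

up = direction_arrow(101.0, 100.0)
down = direction_arrow(100.0, 101.0, "↑")
held = direction_arrow(100.001, 100.0, "↑")  # within threshold: keep previous
```

Preserving the previous arrow means the UI keeps showing the last real trend instead of flickering to blank whenever a metric is momentarily flat.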
def save_notifications(self, notifications):
"""Save notifications to persistent storage."""
try:
# If we have Redis, use it
if self.redis_client:
notifications_json = json.dumps(notifications)
self.redis_client.set("dashboard_notifications", notifications_json)
return True
else:
# Otherwise just keep in memory
return True
except Exception as e:
logging.error(f"Error saving notifications: {e}")
return False
def get_notifications(self):
"""Retrieve notifications from persistent storage."""
try:
# If we have Redis, use it
if self.redis_client:
notifications_json = self.redis_client.get("dashboard_notifications")
if notifications_json:
return json.loads(notifications_json)
# Return empty list if not found or no Redis
return []
except Exception as e:
logging.error(f"Error retrieving notifications: {e}")
return []

static/css/blocks.css (new file, 371 lines)
/* Styles specific to the blocks page */
/* Block controls */
.block-controls {
display: flex;
flex-wrap: wrap;
justify-content: space-between;
align-items: center;
gap: 10px;
}
.block-control-item {
display: flex;
align-items: center;
gap: 10px;
}
.block-input {
background-color: var(--bg-color) !important;
border: 1px solid var(--primary-color) !important;
color: var(--text-color);
padding: 5px 10px;
font-family: var(--terminal-font);
width: 150px;
}
.block-input:focus {
outline: none;
box-shadow: 0 0 8px rgba(247, 147, 26, 0.5);
}
.block-button {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 5px 15px;
font-family: var(--terminal-font);
cursor: pointer;
transition: all 0.2s ease;
}
.block-button:hover {
background-color: var(--primary-color);
color: var(--bg-color);
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
}
/* Latest block stats */
.latest-block-stats {
display: grid;
grid-template-columns: repeat(auto-fill, minmax(200px, 1fr));
gap: 15px;
}
.stat-item {
display: flex;
flex-direction: column;
}
.stat-item strong {
color: #f7931a; /* Use the Bitcoin orange color for labels */
}
/* Blocks grid */
.blocks-container {
overflow-x: auto;
}
.blocks-grid {
display: grid;
grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
gap: 15px;
margin-top: 15px;
}
.block-card {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
box-shadow: 0 0 5px rgba(247, 147, 26, 0.3);
position: relative;
overflow: hidden;
padding: 12px;
cursor: pointer;
transition: all 0.2s ease;
}
.block-card:hover {
box-shadow: 0 0 15px rgba(247, 147, 26, 0.5);
transform: translateY(-2px);
}
.block-card::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient( 0deg, rgba(0, 0, 0, 0.05), rgba(0, 0, 0, 0.05) 1px, transparent 1px, transparent 2px );
pointer-events: none;
z-index: 1;
}
.block-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 8px;
}
.block-height {
font-size: 1.2rem;
font-weight: bold;
color: var(--primary-color);
}
.block-time {
font-size: 0.9rem;
color: #00dfff;
}
.block-info {
display: grid;
grid-template-columns: repeat(2, 1fr);
gap: 8px 15px;
}
.block-info-item {
display: flex;
flex-direction: column;
}
.block-info-label {
font-size: 0.8rem;
color: #aaa;
}
.block-info-value {
font-size: 0.9rem;
}
.block-info-value.yellow {
color: #ffd700;
}
.block-info-value.green {
color: #32CD32;
}
.block-info-value.blue {
color: #00dfff;
}
.block-info-value.white {
color: #ffffff;
}
.block-info-value.red {
color: #ff5555;
}
/* Loader */
.loader {
text-align: center;
padding: 20px;
grid-column: 1 / -1;
}
.loader-text {
display: inline-block;
margin-right: 5px;
}
/* Modal styles */
.block-modal {
display: none;
position: fixed;
z-index: 1000;
left: 0;
top: 0;
width: 100%;
height: 100%;
overflow: auto;
background-color: rgba(0, 0, 0, 0.8);
}
.block-modal-content {
background-color: var(--bg-color);
margin: 5% auto;
border: 1px solid var(--primary-color);
box-shadow: 0 0 20px rgba(247, 147, 26, 0.5);
width: 90%;
max-width: 800px;
max-height: 80vh;
overflow-y: auto;
position: relative;
}
.block-modal-content::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient( 0deg, rgba(0, 0, 0, 0.05), rgba(0, 0, 0, 0.05) 1px, transparent 1px, transparent 2px );
pointer-events: none;
z-index: 0;
}
.block-modal-header {
background-color: #000;
color: var(--primary-color);
font-weight: bold;
padding: 0.5rem 1rem;
font-size: 1.1rem;
border-bottom: 1px solid var(--primary-color);
animation: flicker 4s infinite;
font-family: var(--header-font);
display: flex;
justify-content: space-between;
align-items: center;
position: relative;
z-index: 1;
}
.block-modal-close {
color: var(--primary-color);
float: right;
font-size: 28px;
font-weight: bold;
cursor: pointer;
}
.block-modal-close:hover,
.block-modal-close:focus {
color: #ffa500;
}
.block-modal-body {
padding: 1rem;
position: relative;
z-index: 1;
}
#block-details {
display: grid;
grid-template-columns: repeat(auto-fill, minmax(250px, 1fr));
gap: 15px;
}
.block-detail-section {
margin-bottom: 15px;
}
.block-detail-title {
font-size: 1.1rem;
color: var(--primary-color);
margin-bottom: 10px;
font-weight: bold;
}
.block-detail-item {
margin-bottom: 8px;
}
.block-detail-label {
font-size: 0.9rem;
color: #aaa;
}
.block-detail-value {
font-size: 0.9rem;
word-break: break-all;
}
.block-hash {
font-family: monospace;
font-size: 0.8rem;
color: #00dfff;
word-break: break-all;
}
.transaction-data {
display: flex;
flex-direction: column;
gap: 5px;
}
.fee-bar-container {
height: 5px;
width: 100%;
background-color: rgba(255, 255, 255, 0.1);
margin-top: 5px;
position: relative;
overflow: hidden;
}
.fee-bar {
height: 100%;
width: 0;
background: linear-gradient(90deg, #32CD32, #ffd700);
transition: width 0.5s ease;
}
/* Mining Animation Container */
.mining-animation-container {
padding: 0;
background-color: #0a0a0a;
overflow: hidden;
position: relative;
width: 100%;
}
.mining-animation-container::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient( 0deg, rgba(0, 0, 0, 0.05), rgba(0, 0, 0, 0.05) 1px, transparent 1px, transparent 2px );
pointer-events: none;
z-index: 1;
}
#svg-container {
width: 100%;
height: 300px;
overflow: hidden;
display: flex;
justify-content: center;
align-items: center; /* center vertically */
}
svg {
max-width: 100%;
height: auto;
display: block; /* Ensures proper centering */
}
/* Make sure the SVG itself takes more width */
#block-mining-animation {
width: 100%;
height: 300px;
/* Fixed height but full width */
}
/* Mobile responsiveness */
@media (max-width: 768px) {
.latest-block-stats {
grid-template-columns: repeat(auto-fill, minmax(150px, 1fr));
}
.blocks-grid {
grid-template-columns: 1fr;
}
.block-modal-content {
width: 95%;
margin: 10% auto;
}
#block-details {
grid-template-columns: 1fr;
}
#svg-container {
height: 250px;
}
}

static/css/boot.css (new file, 664 lines)
/* Config form styling - fixed width and hidden by default */
#config-form {
display: none;
width: 500px;
max-width: 90%;
margin: 30px auto;
padding: 20px;
background-color: #0d0d0d;
border: 1px solid #f7931a;
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
border-radius: 4px;
}
/* Boot text color - updated with theme toggling */
body:not(.deepsea-theme) #terminal,
body:not(.deepsea-theme) #output,
body:not(.deepsea-theme) #prompt-container,
body:not(.deepsea-theme) #prompt-text,
body:not(.deepsea-theme) #user-input,
body:not(.deepsea-theme) #loading-message {
color: #f7931a;
}
/* DeepSea theme text color */
body.deepsea-theme #terminal,
body.deepsea-theme #output,
body.deepsea-theme #prompt-container,
body.deepsea-theme #prompt-text,
body.deepsea-theme #user-input,
body.deepsea-theme #loading-message {
color: #0088cc;
}
/* DeepSea cursor color */
body.deepsea-theme .cursor,
body.deepsea-theme .prompt-cursor {
background-color: #0088cc;
box-shadow: 0 0 5px rgba(0, 136, 204, 0.8);
}
/* Boot-specific DeepSea theme adjustments */
body.deepsea-theme #bitcoin-logo {
color: #0088cc;
border-color: #0088cc;
text-shadow: 0 0 10px rgba(0, 136, 204, 0.5);
box-shadow: 0 0 15px rgba(0, 136, 204, 0.5);
}
body.deepsea-theme #config-form {
border: 1px solid #0088cc;
box-shadow: 0 0 10px rgba(0, 136, 204, 0.5);
}
body.deepsea-theme .config-title {
color: #0088cc;
}
body.deepsea-theme .form-group label {
color: #0088cc;
}
body.deepsea-theme .form-group input,
body.deepsea-theme .form-group select {
border: 1px solid #0088cc;
}
body.deepsea-theme .form-group input:focus,
body.deepsea-theme .form-group select:focus {
box-shadow: 0 0 5px #0088cc;
}
body.deepsea-theme .btn {
background-color: #0088cc;
}
body.deepsea-theme .btn:hover {
background-color: #00b3ff;
}
body.deepsea-theme .btn-secondary {
background-color: #333;
color: #0088cc;
}
body.deepsea-theme .tooltip .tooltip-text {
border: 1px solid #0088cc;
}
body.deepsea-theme .form-group select {
background-image: linear-gradient(45deg, transparent 50%, #0088cc 50%), linear-gradient(135deg, #0088cc 50%, transparent 50%);
}
/* DeepSea skip button */
body.deepsea-theme #skip-button {
background-color: #0088cc;
box-shadow: 0 0 8px rgba(0, 136, 204, 0.5);
}
body.deepsea-theme #skip-button:hover {
background-color: #00b3ff;
box-shadow: 0 0 12px rgba(0, 136, 204, 0.7);
}
/* Original Bitcoin styling preserved by default */
.config-title {
font-size: 24px;
text-align: center;
margin-bottom: 20px;
color: #f7931a;
}
.form-group {
margin-bottom: 15px;
}
.form-group label {
display: block;
margin-bottom: 5px;
color: #f7931a;
}
.form-group input,
.form-group select {
width: 100%;
padding: 8px;
background-color: #0d0d0d;
border: 1px solid #f7931a;
color: #fff;
font-family: 'VT323', monospace;
font-size: 18px;
}
.form-group input:focus,
.form-group select:focus {
outline: none;
box-shadow: 0 0 5px #f7931a;
}
.form-actions {
display: flex;
justify-content: space-between;
margin-top: 20px;
}
.btn {
padding: 8px 16px;
background-color: #f7931a;
color: #000;
border: none;
cursor: pointer;
font-family: 'VT323', monospace;
font-size: 18px;
}
.btn:hover {
background-color: #ffa32e;
}
.btn-secondary {
background-color: #333;
color: #f7931a;
}
#form-message {
margin-top: 15px;
padding: 10px;
border-radius: 3px;
display: none;
}
.message-success {
background-color: rgba(50, 205, 50, 0.2);
border: 1px solid #32CD32;
color: #32CD32;
}
.message-error {
background-color: rgba(255, 0, 0, 0.2);
border: 1px solid #ff0000;
color: #ff0000;
}
.tooltip {
position: relative;
display: inline-block;
margin-left: 5px;
width: 14px;
height: 14px;
background-color: #333;
color: #fff;
border-radius: 50%;
text-align: center;
line-height: 14px;
font-size: 10px;
cursor: help;
}
.tooltip .tooltip-text {
visibility: hidden;
width: 200px;
background-color: #000;
color: #fff;
text-align: center;
border-radius: 3px;
padding: 5px;
position: absolute;
z-index: 1;
bottom: 125%;
left: 50%;
margin-left: -100px;
opacity: 0;
transition: opacity 0.3s;
font-size: 14px;
border: 1px solid #f7931a;
}
.tooltip:hover .tooltip-text {
visibility: visible;
opacity: 1;
}
/* Style the select dropdown with custom arrow */
.form-group select {
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
background-image: linear-gradient(45deg, transparent 50%, #f7931a 50%), linear-gradient(135deg, #f7931a 50%, transparent 50%);
background-position: calc(100% - 15px) calc(1em + 0px), calc(100% - 10px) calc(1em + 0px);
background-size: 5px 5px, 5px 5px;
background-repeat: no-repeat;
padding-right: 30px;
}
/* Base styling for the Bitcoin logo */
#bitcoin-logo {
position: relative;
white-space: pre;
font-family: monospace;
height: 130px; /* Set fixed height to match original logo */
display: flex;
align-items: center;
justify-content: center;
flex-direction: column;
}
/* Update the DeepSea theme logo styling */
body.deepsea-theme #bitcoin-logo {
color: transparent; /* Hide original logo */
position: relative;
text-shadow: none;
min-height: 120px; /* Ensure enough height for the new logo */
}
/* Add the new DeepSea ASCII art */
body.deepsea-theme #bitcoin-logo::after {
content: " ____ ____ \A| _ \\ ___ ___ _ __/ ___| ___ __ _ \A| | | |/ _ \\/ _ \\ '_ \\___ \\ / _ \\/ _` |\A| |_| | __/ __/ |_) |__) | __/ (_| |\A|____/ \\___|\\___|_.__/____/ \\___|\\__,_|\A|_| ";
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%); /* Center perfectly */
font-size: 100%; /* Full size */
font-weight: bold;
line-height: 1.2;
color: #0088cc;
white-space: pre;
display: block;
text-shadow: 0 0 10px rgba(0, 136, 204, 0.5);
font-family: monospace;
z-index: 1;
padding: 10px 0;
}
/* Add "DeepSea" version info */
body.deepsea-theme #bitcoin-logo::before {
content: "v.21";
position: absolute;
bottom: 0;
right: 10px;
color: #0088cc;
font-size: 16px;
text-shadow: 0 0 5px rgba(0, 136, 204, 0.5);
font-family: 'VT323', monospace;
z-index: 2; /* Ensure version displays on top */
}
/* Ocean Wave Ripple Effect for DeepSea Theme */
body.deepsea-theme::after {
content: "";
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
pointer-events: none;
background: transparent;
opacity: 0.1;
z-index: 10;
animation: oceanRipple 8s infinite linear;
background-image: repeating-linear-gradient( 0deg, rgba(0, 136, 204, 0.1), rgba(0, 136, 204, 0.1) 1px, transparent 1px, transparent 6px );
background-size: 100% 6px;
}
/* Ocean waves moving animation */
@keyframes oceanRipple {
0% {
transform: translateY(0);
}
100% {
transform: translateY(6px);
}
}
/* Retro glitch effect for DeepSea Theme */
body.deepsea-theme::before {
content: "";
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
pointer-events: none;
z-index: 3;
opacity: 0.15;
background-image: linear-gradient(rgba(18, 16, 16, 0) 50%, rgba(0, 73, 109, 0.1) 50%), linear-gradient(90deg, rgba(0, 81, 122, 0.03), rgba(0, 136, 204, 0.08), rgba(0, 191, 255, 0.03));
background-size: 100% 2px, 3px 100%;
animation: glitchEffect 2s infinite;
}
/* Glitch animation */
@keyframes glitchEffect {
0% {
opacity: 0.15;
background-position: 0 0;
}
20% {
opacity: 0.17;
}
40% {
opacity: 0.14;
background-position: -1px 0;
}
60% {
opacity: 0.15;
background-position: 1px 0;
}
80% {
opacity: 0.16;
background-position: -2px 0;
}
100% {
opacity: 0.15;
background-position: 0 0;
}
}
/* Deep underwater light rays */
body.deepsea-theme {
position: relative;
overflow: hidden;
}
body.deepsea-theme .underwater-rays {
position: fixed;
top: -50%;
left: -50%;
right: -50%;
bottom: -50%;
width: 200%;
height: 200%;
background: rgba(0, 0, 0, 0);
pointer-events: none;
z-index: 1;
background-image: radial-gradient(ellipse at top, rgba(0, 136, 204, 0.1) 0%, rgba(0, 136, 204, 0) 70%), radial-gradient(ellipse at bottom, rgba(0, 91, 138, 0.15) 0%, rgba(0, 0, 0, 0) 70%);
animation: lightRays 15s ease infinite alternate;
}
/* Light ray animation */
@keyframes lightRays {
0% {
transform: rotate(0deg) scale(1);
opacity: 0.3;
}
50% {
opacity: 0.4;
}
100% {
transform: rotate(360deg) scale(1.1);
opacity: 0.3;
}
}
/* Subtle digital noise texture */
body.deepsea-theme .digital-noise {
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAADIAAAAyCAYAAAAeP4ixAAAABmJLR0QA/wD/AP+gvaeTAAAACXBIWXMAAA3XAAAN1wFCKJt4AAAAB3RJTUUH4woEFQwNDaabTQAAABl0RVh0Q29tbWVudABDcmVhdGVkIHdpdGggR0lNUFeBDhcAAACASURBVGje7dixDcIwFEbhb8QMKWn5dwEWY4fswAasRJkBkhfAIarsNDEF5x5LrV/dJ1cEAAAAAOzHuefF5byzZ7tS6xDj6qoQpdRxUvNM6lH3rPeM1+ZJ3ROtqe9feGcjY8z74M8UvJGxEVHxTcIbGSsR+SECAAAAsC9/8G82GwHDD80AAAAASUVORK5CYII=');
opacity: 0.05;
z-index: 2;
pointer-events: none;
animation: noise 0.5s steps(5) infinite;
}
/* Noise animation */
@keyframes noise {
0% {
transform: translate(0, 0);
}
20% {
transform: translate(-1px, 1px);
}
40% {
transform: translate(1px, -1px);
}
60% {
transform: translate(-2px, -1px);
}
80% {
transform: translate(2px, 1px);
}
100% {
transform: translate(0, 0);
}
}
/* Base Styles with a subtle radial background for extra depth */
body {
background: linear-gradient(135deg, #121212, #000000);
color: #f7931a;
font-family: 'VT323', monospace;
font-size: 20px;
line-height: 1.4;
margin: 0;
padding: 10px;
overflow-x: hidden;
height: calc(100vh - 100px);
display: flex;
flex-direction: column;
}
/* CRT Screen Effect */
body::before {
content: " ";
display: block;
position: fixed;
top: 0; left: 0; bottom: 0; right: 0;
background: linear-gradient(rgba(18, 16, 16, 0) 50%, rgba(0, 0, 0, 0.1) 50%),
linear-gradient(90deg, rgba(255, 0, 0, 0.03), rgba(0, 255, 0, 0.02), rgba(0, 0, 255, 0.03));
background-size: 100% 2px, 3px 100%;
pointer-events: none;
z-index: 2;
opacity: 0.15;
}
/* Flicker Animation */
@keyframes flicker {
0% { opacity: 0.97; }
5% { opacity: 0.95; }
10% { opacity: 0.97; }
15% { opacity: 0.94; }
20% { opacity: 0.98; }
50% { opacity: 0.95; }
80% { opacity: 0.96; }
90% { opacity: 0.94; }
100% { opacity: 0.98; }
}
/* Terminal Window with scrolling enabled */
#terminal {
width: 100%;
max-width: 900px;
margin: 0 auto;
white-space: pre-wrap;
word-break: break-word;
animation: flicker 4s infinite;
height: 400px;
overflow-y: auto;
position: relative;
flex: 1;
}
#terminal-content {
position: absolute;
bottom: 0;
width: 100%;
}
.cursor {
display: inline-block;
width: 10px;
height: 16px;
background-color: #f7931a;
animation: blink 1s step-end infinite;
vertical-align: middle;
box-shadow: 0 0 5px rgba(247, 147, 26, 0.8);
}
@keyframes blink {
0%, 100% { opacity: 1; }
50% { opacity: 0; }
}
/* Neon-inspired color classes */
.green {
color: #39ff14 !important;
}
.blue {
color: #00dfff !important;
}
.yellow {
color: #ffd700 !important;
}
.white {
color: #ffffff !important;
}
.red {
color: #ff2d2d !important;
}
.magenta {
color: #ff2d95 !important;
}
/* Bitcoin Logo styling with extra neon border */
#bitcoin-logo {
display: block;
visibility: hidden;
text-align: center;
margin: 10px auto;
font-size: 10px;
line-height: 1;
color: #f7931a;
text-shadow: 0 0 10px rgba(247, 147, 26, 0.8);
white-space: pre;
width: 260px;
padding: 10px;
border: 2px solid #f7931a;
background-color: #0a0a0a;
box-shadow: 0 0 15px rgba(247, 147, 26, 0.5);
font-family: monospace;
opacity: 0;
transition: opacity 1s ease;
}
/* Skip Button */
#skip-button {
position: fixed;
bottom: 20px;
right: 20px;
background-color: #f7931a;
color: #000;
border: none;
padding: 10px 15px;
border-radius: 5px;
cursor: pointer;
font-family: 'VT323', monospace;
font-size: 16px;
box-shadow: 0 0 8px rgba(247, 147, 26, 0.5);
transition: all 0.2s ease;
z-index: 50; /* Lower z-index value */
}
#skip-button:hover {
background-color: #ffa32e;
box-shadow: 0 0 12px rgba(247, 147, 26, 0.7);
}
/* Mobile-specific adjustments */
@media (max-width: 768px) {
#skip-button {
bottom: 25px;
right: 10px;
padding: 10px 18px; /* Larger touch target for mobile */
font-size: 18px;
height: 40px;
z-index: 50;
}
}
/* Add this to your CSS */
#config-form {
z-index: 100; /* Higher than the skip button */
position: relative; /* Needed for z-index to work properly */
}
/* Prompt Styling */
#prompt-container {
display: none;
white-space: nowrap;
}
#prompt-text {
color: #f7931a;
margin-right: 5px;
display: inline;
}
#user-input {
background: transparent;
border: none;
color: #f7931a;
font-family: 'VT323', monospace;
font-size: 20px;
caret-color: transparent;
outline: none;
width: 35px;
height: 33px;
padding: 0;
margin: 0;
display: inline-block;
vertical-align: top;
}
.prompt-cursor {
display: inline-block;
width: 10px;
height: 16px;
background-color: #f7931a;
animation: blink 1s step-end infinite;
vertical-align: middle;
box-shadow: 0 0 5px rgba(247, 147, 26, 0.8);
position: relative;
top: 1px;
margin-left: -2px;
}
/* Mobile Responsiveness */
@media (max-width: 600px) {
body { font-size: 14px; padding: 10px; }
#terminal { margin: 0; }
}
/* Loading and Debug Info */
#loading-message {
text-align: center;
margin-bottom: 10px;
}
#debug-info {
position: fixed;
bottom: 10px;
left: 10px;
color: #666;
font-size: 12px;
z-index: 100;
}

static/css/common.css Normal file

@@ -0,0 +1,501 @@
.footer {
margin-top: 30px;
padding: 10px 0;
color: grey;
font-size: 0.9rem;
border-top: 1px solid rgba(128, 128, 128, 0.2);
}
</style>
<!-- Preload theme to prevent flicker -->
<style id="theme-preload">
/* Theme-aware loading state */
html.bitcoin-theme {
background-color: #111111;
}
html.deepsea-theme {
background-color: #0c141a;
}
#theme-loader {
position: fixed;
top: 0;
left: 0;
width: 100vw;
height: 100vh;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
z-index: 9999;
font-family: 'VT323', monospace;
}
html.bitcoin-theme #theme-loader {
background-color: #111111;
color: #f2a900;
}
html.deepsea-theme #theme-loader {
background-color: #0c141a;
color: #0088cc;
}
#loader-icon {
font-size: 48px;
margin-bottom: 20px;
animation: spin 2s infinite linear;
}
#loader-text {
font-size: 24px;
text-transform: uppercase;
}
@keyframes spin {
0% {
transform: rotate(0deg);
}
100% {
transform: rotate(360deg);
}
}
/* Hide content during load */
body {
visibility: hidden;
}
/* Common styling elements shared across all pages */
:root {
--bg-color: #0a0a0a;
--bg-gradient: linear-gradient(135deg, #0a0a0a, #1a1a1a);
--primary-color: #f7931a;
--accent-color: #00ffff;
--text-color: #ffffff;
--card-padding: 0.5rem;
--text-size-base: 16px;
--terminal-font: 'VT323', monospace;
--header-font: 'Orbitron', sans-serif;
--text-transform: uppercase;
}
@media (min-width: 768px) {
:root {
--card-padding: 0.75rem;
--text-size-base: 18px;
}
}
/* CRT Screen Effect */
body::before {
content: " ";
display: block;
position: fixed;
top: 0; left: 0; bottom: 0; right: 0;
background: linear-gradient(rgba(18, 16, 16, 0) 50%, rgba(0, 0, 0, 0.1) 50%),
linear-gradient(90deg, rgba(255, 0, 0, 0.03), rgba(0, 255, 0, 0.02), rgba(0, 0, 255, 0.03));
background-size: 100% 2px, 3px 100%;
pointer-events: none;
z-index: 2;
opacity: 0.15;
}
/* Flicker Animation */
@keyframes flicker {
0% { opacity: 0.97; }
5% { opacity: 0.95; }
10% { opacity: 0.97; }
15% { opacity: 0.94; }
20% { opacity: 0.98; }
50% { opacity: 0.95; }
80% { opacity: 0.96; }
90% { opacity: 0.94; }
100% { opacity: 0.98; }
}
body {
background: var(--bg-gradient);
color: var(--text-color);
padding-top: 0.5rem;
font-size: var(--text-size-base);
font-family: var(--terminal-font);
text-transform: uppercase;
}
h1 {
font-size: 24px;
font-weight: bold;
color: var(--primary-color);
font-family: var(--header-font);
letter-spacing: 1px;
animation: flicker 4s infinite;
}
@media (min-width: 768px) {
h1 {
font-size: 26px;
}
}
/* Navigation links */
.navigation-links {
display: flex;
justify-content: center;
margin-top: 10px;
margin-bottom: 15px;
}
.nav-link {
padding: 5px 15px;
margin: 0 10px;
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
text-decoration: none;
font-family: var(--terminal-font);
transition: all 0.3s ease;
}
.nav-link:hover {
background-color: var(--primary-color);
color: var(--bg-color);
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
}
.nav-link.active {
background-color: var(--primary-color);
color: var(--bg-color);
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
}
/* Top right link */
#topRightLink {
position: absolute;
top: 10px;
right: 10px;
color: grey;
text-decoration: none;
font-size: 0.7rem; /* Decreased font size */
padding: 5px 10px; /* Add padding for a larger clickable area */
transition: background-color 0.3s ease; /* Optional: Add hover effect */
}
#topRightLink:hover {
background-color: rgba(255, 255, 255, 0.1); /* Optional: Highlight on hover */
}
/* Card styles */
.card,
.card-header,
.card-body,
.card-footer {
border-radius: 0 !important;
text-transform: uppercase;
}
/* Enhanced card with scanlines */
.card {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
margin-bottom: 0.5rem;
padding: var(--card-padding);
flex: 1;
position: relative;
overflow: hidden;
box-shadow: 0 0 5px rgba(247, 147, 26, 0.3);
}
/* Scanline effect for cards */
.card::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient(
0deg,
rgba(0, 0, 0, 0.05),
rgba(0, 0, 0, 0.05) 1px,
transparent 1px,
transparent 2px
);
pointer-events: none;
z-index: 1;
}
.card-header {
background-color: #000;
color: var(--primary-color);
font-weight: bold;
padding: 0.3rem 0.5rem;
font-size: 1.1rem;
border-bottom: 1px solid var(--primary-color);
animation: flicker 4s infinite;
font-family: var(--header-font);
}
.card-body hr {
border-top: 1px solid var(--primary-color);
margin: 0.25rem 0;
}
/* Connection status indicator */
#connectionStatus {
display: none;
position: fixed;
top: 10px;
right: 10px;
background: rgba(255,0,0,0.7);
color: white;
padding: 10px;
border-radius: 5px;
z-index: 9999;
font-size: 0.9rem;
box-shadow: 0 0 10px rgba(255, 0, 0, 0.5);
}
/* Last Updated text with subtle animation */
#lastUpdated {
animation: flicker 5s infinite;
text-align: center;
}
/* Cursor blink for terminal feel */
#terminal-cursor {
display: inline-block;
width: 10px;
height: 16px;
background-color: #f7931a;
margin-left: 2px;
animation: blink 1s step-end infinite;
vertical-align: middle;
box-shadow: 0 0 5px rgba(247, 147, 26, 0.8);
}
@keyframes blink {
0%, 100% { opacity: 1; }
50% { opacity: 0; }
}
/* Container */
.container-fluid {
max-width: 1200px;
margin: 0 auto;
padding-left: 1rem;
padding-right: 1rem;
position: relative;
}
/* Status indicators */
.online-dot {
display: inline-block;
width: 8px;
height: 8px;
background: #32CD32;
border-radius: 50%;
margin-left: 0.5em;
position: relative;
top: -1px;
animation: glow 3s infinite;
box-shadow: 0 0 10px #32CD32, 0 0 20px #32CD32;
}
@keyframes glow {
0%, 100% { box-shadow: 0 0 10px #32CD32, 0 0 15px #32CD32; }
50% { box-shadow: 0 0 15px #32CD32, 0 0 25px #32CD32; }
}
.offline-dot {
display: inline-block;
width: 8px;
height: 8px;
background: red;
border-radius: 50%;
margin-left: 0.5em;
position: relative;
top: -1px;
animation: glowRed 3s infinite;
box-shadow: 0 0 10px red, 0 0 20px red;
}
@keyframes glowRed {
0%, 100% { box-shadow: 0 0 10px red, 0 0 15px red; }
50% { box-shadow: 0 0 15px red, 0 0 25px red; }
}
/* Color utility classes */
.green-glow, .status-green {
color: #39ff14 !important;
}
.red-glow, .status-red {
color: #ff2d2d !important;
}
.yellow-glow {
color: #ffd700 !important;
}
.blue-glow {
color: #00dfff !important;
}
.white-glow {
color: #ffffff !important;
}
/* Basic color classes for backward compatibility */
.green {
color: #39ff14 !important;
}
.blue {
color: #00dfff !important;
}
.yellow {
color: #ffd700 !important;
font-weight: normal !important;
}
.white {
color: #ffffff !important;
}
.red {
color: #ff2d2d !important;
}
.magenta {
color: #ff2d95 !important;
}
/* Bitcoin Progress Bar Styles */
.bitcoin-progress-container {
width: 100%;
max-width: 300px;
height: 20px;
background-color: #111;
border: 1px solid var(--primary-color);
border-radius: 0;
margin: 0.5rem auto;
position: relative;
overflow: hidden;
box-shadow: 0 0 8px rgba(247, 147, 26, 0.5);
align-self: center;
}
.bitcoin-progress-inner {
height: 100%;
width: 0;
background: linear-gradient(90deg, #f7931a, #ffa500);
border-radius: 0;
transition: width 0.3s ease;
position: relative;
overflow: hidden;
}
.bitcoin-progress-inner::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: linear-gradient(90deg,
rgba(255, 255, 255, 0.1) 0%,
rgba(255, 255, 255, 0.2) 20%,
rgba(255, 255, 255, 0.1) 40%);
animation: shimmer 2s infinite;
}
@keyframes shimmer {
0% { transform: translateX(-100%); }
100% { transform: translateX(100%); }
}
.bitcoin-icons {
position: absolute;
top: 50%;
left: 0;
width: 100%;
transform: translateY(-50%);
display: flex;
justify-content: space-around;
font-size: 12px;
color: rgba(0, 0, 0, 0.7);
}
.glow-effect {
box-shadow: 0 0 15px #f7931a, 0 0 25px #f7931a;
animation: pulse 1s infinite;
}
/* Extra styling for when server update is late */
.waiting-for-update {
animation: waitingPulse 2s infinite !important;
}
@keyframes waitingPulse {
0%, 100% { box-shadow: 0 0 10px #f7931a, 0 0 15px #f7931a; opacity: 0.8; }
50% { box-shadow: 0 0 20px #f7931a, 0 0 35px #f7931a; opacity: 1; }
}
@keyframes pulse {
0%, 100% { opacity: 1; }
50% { opacity: 0.8; }
}
#progress-text {
font-size: 1rem;
color: var(--primary-color);
margin-top: 0.3rem;
text-align: center;
width: 100%;
}
/* Mobile responsiveness */
@media (max-width: 576px) {
.container-fluid {
padding-left: 0.5rem;
padding-right: 0.5rem;
}
.card-body {
padding: 0.5rem;
}
h1 {
font-size: 22px;
}
.card-header {
font-size: 1rem;
}
#topRightLink {
position: static;
display: block;
text-align: right;
margin-bottom: 0.5rem;
}
}
/* Navigation badges for notifications */
.nav-badge {
background-color: var(--primary-color);
color: var(--bg-color);
border-radius: 10px;
font-size: 0.7rem;
padding: 1px 5px;
min-width: 16px;
text-align: center;
display: none;
margin-left: 5px;
vertical-align: middle;
}
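The `.bitcoin-progress-*` rules above are driven from JavaScript (the repo's `BitcoinProgressBar.js`). As a rough sketch of the width/waiting logic those styles imply — the function names and the 60-second interval here are assumptions, not the actual API:

```javascript
// Sketch only: names and the refresh interval are assumptions,
// not the actual interface of BitcoinProgressBar.js.
const REFRESH_INTERVAL_MS = 60000; // assumed server refresh cycle

// Map elapsed time since the last server update to a bar width (0-100%).
function progressPercent(elapsedMs, intervalMs = REFRESH_INTERVAL_MS) {
  return Math.min(100, Math.max(0, (elapsedMs / intervalMs) * 100));
}

// Past the interval the bar stops growing and the .waiting-for-update
// pulse defined above takes over.
function progressClasses(elapsedMs, intervalMs = REFRESH_INTERVAL_MS) {
  return elapsedMs >= intervalMs
    ? 'bitcoin-progress-inner waiting-for-update'
    : 'bitcoin-progress-inner';
}

// barEl is any object with style/className, e.g. the .bitcoin-progress-inner div.
function applyProgress(barEl, elapsedMs) {
  barEl.style.width = progressPercent(elapsedMs) + '%';
  barEl.className = progressClasses(elapsedMs);
}
```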

static/css/dashboard.css Normal file

@@ -0,0 +1,247 @@
/* Specific styles for the main dashboard */
#graphContainer {
background-color: #000;
padding: 0.5rem;
margin-bottom: 1rem;
height: 230px;
border: 1px solid var(--primary-color);
box-shadow: 0 0 10px rgba(247, 147, 26, 0.2);
position: relative;
}
/* Add scanline effect to graph */
#graphContainer::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient(
0deg,
rgba(0, 0, 0, 0.1),
rgba(0, 0, 0, 0.1) 1px,
transparent 1px,
transparent 2px
);
pointer-events: none;
z-index: 1;
}
/* Override for Payout & Misc card */
#payoutMiscCard {
margin-bottom: 0.5rem;
}
/* Row equal height for card alignment */
.row.equal-height {
display: flex;
flex-wrap: wrap;
margin-bottom: 1rem;
}
.row.equal-height > [class*="col-"] {
display: flex;
margin-bottom: 0.5rem;
}
.row.equal-height > [class*="col-"] .card {
width: 100%;
}
/* Arrow indicator styles */
.arrow {
display: inline-block;
font-weight: bold;
margin-left: 0.5rem;
}
/* Bounce animations for indicators */
@keyframes bounceUp {
0% { transform: translateY(0); }
25% { transform: translateY(-2px); }
50% { transform: translateY(0); }
75% { transform: translateY(-2px); }
100% { transform: translateY(0); }
}
@keyframes bounceDown {
0% { transform: translateY(0); }
25% { transform: translateY(2px); }
50% { transform: translateY(0); }
75% { transform: translateY(2px); }
100% { transform: translateY(0); }
}
.bounce-up {
animation: bounceUp 1s infinite;
}
.bounce-down {
animation: bounceDown 1s infinite;
}
.chevron {
font-size: 0.8rem;
position: relative;
}
/* Refresh timer container */
#refreshUptime {
text-align: center;
margin-top: 0.5rem;
}
#refreshContainer {
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
width: 100%;
}
#uptimeTimer strong {
font-weight: bold;
}
#uptimeTimer {
margin-top: 0;
}
/* Metric styling by category */
.metric-value {
color: var(--text-color);
font-weight: bold;
}
/* Yellow color family (BTC price, sats metrics, time to payout) */
#btc_price,
#daily_mined_sats,
#monthly_mined_sats,
#estimated_earnings_per_day_sats,
#estimated_earnings_next_block_sats,
#estimated_rewards_in_window_sats,
#est_time_to_payout {
color: #ffd700;
}
/* Green color family (profits, earnings) */
#unpaid_earnings,
#daily_revenue,
#daily_profit_usd,
#monthly_profit_usd {
color: #32CD32;
}
/* Red color family (costs) */
#daily_power_cost {
color: #ff5555 !important;
}
/* White metrics (general stats) */
.metric-value.white,
#block_number,
#network_hashrate,
#difficulty,
#workers_hashing,
#last_share,
#blocks_found,
#last_block_height,
#pool_fees_percentage {
color: #ffffff;
}
/* Blue metrics (time data) */
#last_block_time {
color: #00dfff;
}
.card-body strong {
color: var(--primary-color);
margin-right: 0.25rem;
}
.card-body p {
margin: 0.25rem 0;
line-height: 1.2;
}
/* Hidden Congrats Message */
#congratsMessage {
display: none;
position: fixed;
top: 20px;
left: 50%;
transform: translateX(-50%);
z-index: 1000;
background: #f7931a;
color: #000;
padding: 10px;
border-radius: 5px;
box-shadow: 0 0 15px rgba(247, 147, 26, 0.7);
text-transform: uppercase;
}
/* Add bottom padding to accommodate minimized system monitor */
.container-fluid {
padding-bottom: 60px !important; /* Enough space for minimized monitor */
}
/* Add these styles to dashboard.css */
@keyframes pulse-block-marker {
0% {
transform: translate(-50%, -50%) rotate(45deg) scale(1);
opacity: 1;
}
50% {
transform: translate(-50%, -50%) rotate(45deg) scale(1.3);
opacity: 0.8;
}
100% {
transform: translate(-50%, -50%) rotate(45deg) scale(1);
opacity: 1;
}
}
.chart-container-relative {
position: relative;
}
/* Styling for optimal fee indicator */
.fee-star {
color: gold;
margin-left: 4px;
font-size: 1.2em;
vertical-align: middle;
}
.datum-label {
color: #ffffff; /* White color */
font-size: 0.95em;
font-weight: bold;
text-transform: uppercase;
margin-left: 4px;
background-color: rgba(0, 0, 0, 0.2);
border-radius: 3px;
letter-spacing: 2px;
}
/* Pool luck indicators */
.very-lucky {
color: #32CD32 !important;
font-weight: bold !important;
}
.lucky {
color: #90EE90 !important;
}
.normal-luck {
color: #ffd700 !important;
}
.unlucky {
color: #ff5555 !important;
}
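The `.fee-star` indicator above is populated from `dashboard.html`; per the pool-fees commit in this range, values between 0.9 and 1.3 now earn three stars. A hedged sketch of that rating logic — only the 0.9–1.3 three-star band comes from the commit message; the function name and the one-star fallback are assumptions:

```javascript
// Sketch of the star rating shown next to metrics.pool_fees_percentage.
// The 0.9-1.3 three-star band is from the commit message; the function
// name and the one-star fallback are assumptions.
function feeStars(poolFeesPercentage) {
  if (poolFeesPercentage >= 0.9 && poolFeesPercentage <= 1.3) {
    return '★★★'; // optimal fee range
  }
  return '★';
}
```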

static/css/error.css Normal file

@@ -0,0 +1,134 @@
:root {
--bg-color: #0a0a0a;
--bg-gradient: linear-gradient(135deg, #0a0a0a, #1a1a1a);
--primary-color: #f7931a;
--text-color: white;
--terminal-font: 'VT323', monospace;
--header-font: 'Orbitron', sans-serif;
}
/* CRT Screen Effect */
body::before {
content: " ";
display: block;
position: fixed;
top: 0; left: 0; bottom: 0; right: 0;
background: linear-gradient(rgba(18, 16, 16, 0) 50%, rgba(0, 0, 0, 0.1) 50%),
linear-gradient(90deg, rgba(255, 0, 0, 0.03), rgba(0, 255, 0, 0.02), rgba(0, 0, 255, 0.03));
background-size: 100% 2px, 3px 100%;
pointer-events: none;
z-index: 2;
opacity: 0.15;
}
/* Flicker Animation */
@keyframes flicker {
0% { opacity: 0.97; }
5% { opacity: 0.95; }
10% { opacity: 0.97; }
15% { opacity: 0.94; }
20% { opacity: 0.98; }
50% { opacity: 0.95; }
80% { opacity: 0.96; }
90% { opacity: 0.94; }
100% { opacity: 0.98; }
}
body {
background: var(--bg-gradient);
color: var(--text-color);
padding-top: 50px;
font-family: var(--terminal-font);
}
a.btn-primary {
background-color: var(--primary-color);
border-color: var(--primary-color);
color: black;
margin-top: 20px;
font-family: var(--header-font);
text-shadow: none;
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
transition: all 0.3s ease;
}
a.btn-primary:hover {
background-color: #ffa64d;
box-shadow: 0 0 15px rgba(247, 147, 26, 0.7);
}
/* Enhanced error container with scanlines */
.error-container {
max-width: 600px;
margin: 0 auto;
text-align: center;
padding: 2rem;
border: 1px solid var(--primary-color);
border-radius: 0;
background-color: rgba(0, 0, 0, 0.3);
box-shadow: 0 0 15px rgba(247, 147, 26, 0.3);
position: relative;
overflow: hidden;
animation: flicker 4s infinite;
}
/* Scanline effect for error container */
.error-container::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient(
0deg,
rgba(0, 0, 0, 0.1),
rgba(0, 0, 0, 0.1) 1px,
transparent 1px,
transparent 2px
);
pointer-events: none;
z-index: 1;
}
h1 {
color: var(--primary-color);
margin-bottom: 1rem;
font-family: var(--header-font);
font-weight: bold;
position: relative;
z-index: 2;
}
p {
margin-bottom: 1.5rem;
font-size: 1.5rem;
position: relative;
z-index: 2;
color: #ff5555;
}
/* Cursor blink for terminal feel */
.terminal-cursor {
display: inline-block;
width: 10px;
height: 20px;
background-color: #f7931a;
margin-left: 2px;
animation: blink 1s step-end infinite;
vertical-align: middle;
box-shadow: 0 0 5px rgba(247, 147, 26, 0.8);
}
@keyframes blink {
0%, 100% { opacity: 1; }
50% { opacity: 0; }
}
/* Error code styling */
.error-code {
font-family: var(--terminal-font);
font-size: 1.2rem;
color: #00dfff;
margin-bottom: 1rem;
}

static/css/notifications.css Normal file

@@ -0,0 +1,327 @@
/* notifications.css */
/* Notification Controls */
.notification-controls {
display: flex;
justify-content: space-between;
align-items: center;
flex-wrap: wrap;
gap: 10px;
}
.full-timestamp {
font-size: 0.8em;
color: #888;
}
.filter-buttons {
display: flex;
flex-wrap: wrap;
gap: 5px;
}
.filter-button {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 5px 10px;
font-family: var(--terminal-font);
cursor: pointer;
transition: all 0.3s ease;
text-transform: uppercase;
}
.filter-button:hover {
background-color: rgba(247, 147, 26, 0.2);
}
.filter-button.active {
background-color: var(--primary-color);
color: var(--bg-color);
box-shadow: 0 0 10px rgba(247, 147, 26, 0.5);
}
.notification-actions {
display: flex;
gap: 5px;
align-items: center;
text-transform: uppercase;
}
.action-button {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 6px 12px;
font-family: var(--terminal-font);
cursor: pointer;
transition: all 0.2s ease;
min-width: 80px; /* Set a minimum width to prevent text cutoff */
display: inline-flex;
align-items: center;
justify-content: center;
font-size: 0.9rem; /* Slightly smaller font */
line-height: 1;
text-transform: uppercase;
}
.action-button:hover {
background-color: rgba(247, 147, 26, 0.2);
}
.action-button.danger {
border-color: #ff5555;
color: #ff5555;
}
.action-button.danger:hover {
background-color: rgba(255, 85, 85, 0.2);
}
/* Card header with unread badge */
.card-header {
display: flex;
justify-content: space-between;
align-items: center;
}
.unread-badge {
background-color: var(--primary-color);
color: var(--bg-color);
padding: 2px 8px;
border-radius: 10px;
font-size: 0.8rem;
min-width: 25px;
text-align: center;
}
.unread-badge:empty {
display: none;
}
/* Notifications Container */
#notifications-container {
min-height: 200px;
position: relative;
}
.loading-message {
text-align: center;
padding: 20px;
color: #888;
}
.empty-state {
text-align: center;
padding: 40px 20px;
color: #888;
}
.empty-state i {
font-size: 3rem;
margin-bottom: 15px;
opacity: 0.5;
}
/* Notification Item */
.notification-item {
display: flex;
padding: 12px;
border-bottom: 1px solid rgba(247, 147, 26, 0.2);
transition: background-color 0.2s ease;
position: relative;
background-color: rgba(0, 0, 0, 0.15);
}
.notification-item:hover {
background-color: rgba(247, 147, 26, 0.05);
}
.notification-item[data-read="true"] {
opacity: 0.6;
}
.notification-item[data-level="success"] {
border-left: 3px solid #32CD32;
}
.notification-item[data-level="info"] {
border-left: 3px solid #00dfff;
}
.notification-item[data-level="warning"] {
border-left: 3px solid #ffd700;
}
.notification-item[data-level="error"] {
border-left: 3px solid #ff5555;
}
.notification-icon {
flex: 0 0 40px;
display: flex;
align-items: center;
justify-content: center;
font-size: 1.2rem;
}
.notification-item[data-level="success"] .notification-icon i {
color: #32CD32;
}
.notification-item[data-level="info"] .notification-icon i {
color: #00dfff;
}
.notification-item[data-level="warning"] .notification-icon i {
color: #ffd700;
}
.notification-item[data-level="error"] .notification-icon i {
color: #ff5555;
}
.notification-content {
flex: 1;
padding: 0 15px;
}
.notification-message {
margin-bottom: 5px;
word-break: break-word;
color: white;
}
.notification-meta {
font-size: 0.8rem;
color: #888;
display: flex;
gap: 15px;
}
.notification-category {
text-transform: uppercase;
font-size: 0.7rem;
color: #aaa;
}
.notification-actions {
flex: 0 0 80px;
display: flex;
align-items: center;
justify-content: flex-end;
gap: 5px;
}
.notification-actions button {
background: none;
border: none;
color: #888;
cursor: pointer;
transition: color 0.2s ease;
width: 30px;
height: 30px;
display: flex;
align-items: center;
justify-content: center;
border-radius: 3px;
}
.mark-read-button:hover {
color: #32CD32;
background-color: rgba(50, 205, 50, 0.1);
}
.delete-button:hover {
color: #ff5555;
background-color: rgba(255, 85, 85, 0.1);
}
/* Pagination */
.pagination-controls {
margin-top: 15px;
text-align: center;
}
.load-more-button {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 5px 15px;
font-family: var(--terminal-font);
cursor: pointer;
transition: all 0.3s ease;
}
.load-more-button:hover {
background-color: rgba(247, 147, 26, 0.2);
}
.load-more-button:disabled {
opacity: 0.5;
cursor: not-allowed;
}
/* Notification Animation */
@keyframes fadeIn {
from {
opacity: 0;
transform: translateY(-10px);
}
to {
opacity: 1;
transform: translateY(0);
}
}
.notification-item {
animation: fadeIn 0.3s ease-out;
}
/* Mobile Responsiveness */
@media (max-width: 768px) {
.notification-actions {
flex-direction: column;
gap: 8px;
}
.action-button {
width: 100%; /* Full width on small screens */
padding: 8px 12px;
font-size: 1rem;
}
.notification-controls {
flex-direction: column;
align-items: stretch;
}
.filter-buttons {
overflow-x: auto;
padding-bottom: 5px;
margin-bottom: 5px;
white-space: nowrap;
display: flex;
flex-wrap: nowrap;
}
.notification-actions {
justify-content: flex-end;
}
.notification-item {
padding: 8px;
}
.notification-icon {
flex: 0 0 30px;
}
.notification-content {
padding: 0 8px;
}
.notification-actions {
flex: 0 0 60px;
}
}
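The `.filter-button` and `.unread-badge` styles above imply some client-side filtering and counting. A minimal sketch of what that might look like — the notification shape (`{ level, read }`) and function names are assumptions, not `NotificationService`'s actual interface:

```javascript
// Sketch only: the notification shape and function names are assumptions.
function filterNotifications(items, level) {
  if (level === 'all') return items;
  return items.filter(n => n.level === level); // matches the data-level values
}

// The badge hides itself via the .unread-badge:empty rule above, so an
// empty string (not "0") is returned when nothing is unread.
function unreadBadgeText(items) {
  const count = items.filter(n => !n.read).length;
  return count > 0 ? String(count) : '';
}
```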

static/css/retro-refresh.css

@@ -70,7 +70,6 @@ body {
font-weight: bold;
font-size: 1.1rem; /* Match card header font size */
border-bottom: none;
text-shadow: 0 0 5px var(--primary-color);
animation: flicker 4s infinite; /* Add flicker animation from card headers */
font-family: var(--header-font); /* Use the same font variable */
padding: 0.3rem 0; /* Match card header padding */
@@ -232,7 +231,6 @@ body {
#retro-terminal-bar #progress-text {
font-size: 16px;
color: var(--terminal-text);
text-shadow: 0 0 5px var(--terminal-text);
margin-top: 5px;
text-align: center;
position: relative;
@@ -242,7 +240,6 @@ body {
#retro-terminal-bar #uptimeTimer {
font-size: 16px;
color: var(--terminal-text);
text-shadow: 0 0 5px var(--terminal-text);
text-align: center;
position: relative;
z-index: 2;
@@ -354,7 +351,7 @@ body {
}
.terminal-title {
font-size: 14px;
font-size: 12px;
}
.terminal-dot {
@@ -366,4 +363,76 @@ body {
padding: 6px 10px;
font-size: 12px;
}
}
}
/* Add these styles to retro-refresh.css to make the progress bar transitions smoother */
/* Smooth transition for progress bar width */
#retro-terminal-bar #bitcoin-progress-inner {
transition: width 0.3s ease-out;
}
/* Add a will-change property to optimize the animation */
#retro-terminal-bar .bitcoin-progress-container {
will-change: contents;
}
/* Smooth transition when changing from waiting state */
#retro-terminal-bar #bitcoin-progress-inner.waiting-for-update {
transition: width 0.3s ease-out, box-shadow 1s ease;
}
/* Ensure the scan line stays smooth during transitions */
#retro-terminal-bar .scan-line {
will-change: transform;
}
/* Improve mobile centering for collapsed system monitor */
@media (max-width: 767px) {
/* Target both possible selectors to ensure we catch the right one */
#retro-terminal-bar.collapsed,
.bitcoin-terminal.collapsed,
.retro-terminal-bar.collapsed,
div[id*="terminal"].collapsed {
left: 50% !important;
right: auto !important;
transform: translateX(-50%) !important;
width: auto !important;
max-width: 300px !important; /* Smaller max-width for mobile */
}
/* Ensure consistent height for minimized view */
.terminal-minimized {
height: 40px;
display: flex;
align-items: center;
}
}
/* Make the terminal draggable in desktop view */
@media (min-width: 768px) {
/* Target both possible selectors to handle all cases */
#bitcoin-terminal,
.bitcoin-terminal,
#retro-terminal-bar {
cursor: grab; /* Show a grab cursor to indicate draggability */
user-select: none; /* Prevent text selection during drag */
}
/* Change cursor during active dragging */
#bitcoin-terminal.dragging,
.bitcoin-terminal.dragging,
#retro-terminal-bar.dragging {
cursor: grabbing;
}
/* Style for drag handle in the header */
.terminal-header {
cursor: grab;
}
.terminal-header::before {
content: "⋮⋮"; /* Add drag indicator */
position: absolute;
left: 10px;
opacity: 0.5;
}
}
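The desktop rules above only style the cursor (`grab`/`grabbing`); the actual movement has to come from JS mousedown/mousemove handlers elsewhere in the repo. A minimal sketch of the position math such a handler might use — the function name and the usage wiring are illustrative assumptions, not code from this project:

```javascript
// Hypothetical helper: given where the drag started and where the
// pointer is now, compute the terminal's new top-left position.
function dragPosition(startLeft, startTop, downX, downY, moveX, moveY) {
    return {
        left: startLeft + (moveX - downX),
        top: startTop + (moveY - downY)
    };
}

// A mousemove handler would then apply it roughly like this:
//   const pos = dragPosition(startLeft, startTop, downX, downY, e.clientX, e.clientY);
//   terminal.style.left = pos.left + 'px';
//   terminal.style.top = pos.top + 'px';
```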

209
static/css/theme-toggle.css Normal file

@@ -0,0 +1,209 @@
/* Theme Toggle Button with positioning logic similar to topRightLink */
#themeToggle,
.theme-toggle-btn {
position: absolute; /* Change from fixed to absolute like topRightLink */
z-index: 1000;
background: transparent;
border-width: 1px;
border-style: solid;
font-family: 'VT323', monospace;
transition: all 0.3s ease;
cursor: pointer;
white-space: nowrap;
text-transform: uppercase;
outline: none;
display: flex;
align-items: center;
justify-content: center;
top: 30px; /* Match the top positioning of topRightLink */
left: 15px; /* Keep on left side */
}
/* Desktop specific styling */
@media screen and (min-width: 768px) {
#themeToggle,
.theme-toggle-btn {
padding: 6px 12px;
font-size: 14px;
border-radius: 3px;
letter-spacing: 0.5px;
}
/* Add theme icon for desktop view */
#themeToggle:before,
.theme-toggle-btn:before {
content: " ₿|🌊";
margin-right: 5px;
font-size: 14px;
}
/* Hover effects for desktop */
#themeToggle:hover,
.theme-toggle-btn:hover {
transform: translateY(-2px);
box-shadow: 0 4px 8px rgba(0, 0, 0, 0.3);
}
}
/* Mobile-specific styling */
@media screen and (max-width: 767px) {
#themeToggle,
.theme-toggle-btn {
padding: 10px;
font-size: 12px;
border-radius: 3px;
width: 40px;
height: 35px;
}
/* Use just icon for mobile to save space */
#themeToggle:before,
.theme-toggle-btn:before {
content: " ₿|🌊";
margin-right: 0;
font-size: 14px;
}
/* Hide text on mobile */
#themeToggle span,
.theme-toggle-btn span {
display: none;
}
/* Adjust position when in portrait mode on very small screens */
@media screen and (max-height: 500px) {
#themeToggle,
.theme-toggle-btn {
top: 5px;
left: 5px; /* Keep on left side */
width: 35px;
height: 35px;
font-size: 10px;
}
}
}
/* The rest of the CSS remains unchanged */
/* Active state for the button */
#themeToggle:active,
.theme-toggle-btn:active {
transform: translateY(1px);
box-shadow: 0 0 2px rgba(0, 0, 0, 0.3);
}
/* Bitcoin theme specific styling (orange) */
body:not(.deepsea-theme) #themeToggle,
body:not(.deepsea-theme) .theme-toggle-btn {
color: #f2a900;
border-color: #f2a900;
}
body:not(.deepsea-theme) #themeToggle:hover,
body:not(.deepsea-theme) .theme-toggle-btn:hover {
background-color: rgba(242, 169, 0, 0.1);
box-shadow: 0 4px 8px rgba(242, 169, 0, 0.3);
}
/* DeepSea theme specific styling (blue) */
body.deepsea-theme #themeToggle,
body.deepsea-theme .theme-toggle-btn {
color: #0088cc;
border-color: #0088cc;
}
body.deepsea-theme #themeToggle:hover,
body.deepsea-theme .theme-toggle-btn:hover {
background-color: rgba(0, 136, 204, 0.1);
box-shadow: 0 4px 8px rgba(0, 136, 204, 0.3);
}
/* Transition effect for smoother theme switching */
#themeToggle,
.theme-toggle-btn,
#themeToggle:before,
.theme-toggle-btn:before {
transition: all 0.3s ease;
}
/* Accessibility improvements */
#themeToggle:focus,
.theme-toggle-btn:focus {
box-shadow: 0 0 0 3px rgba(0, 0, 0, 0.3);
outline: none;
}
body:not(.deepsea-theme) #themeToggle:focus,
body:not(.deepsea-theme) .theme-toggle-btn:focus {
box-shadow: 0 0 0 3px rgba(242, 169, 0, 0.3);
}
body.deepsea-theme #themeToggle:focus,
body.deepsea-theme .theme-toggle-btn:focus {
box-shadow: 0 0 0 3px rgba(0, 136, 204, 0.3);
}
/* Theme root variables (shared with common.css) */
html.deepsea-theme {
--primary-color: #0088cc;
}
html.bitcoin-theme {
--primary-color: #f2a900;
}
/* Theme-specific loading styles */
#theme-loader {
position: fixed;
top: 0;
left: 0;
width: 100vw;
height: 100vh;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
z-index: 9999;
font-family: 'VT323', monospace;
}
html.bitcoin-theme #theme-loader {
background-color: #111111;
color: #f2a900;
}
html.deepsea-theme #theme-loader {
background-color: #0c141a;
color: #0088cc;
}
#loader-icon {
font-size: 48px;
margin-bottom: 20px;
animation: spin 2s infinite linear;
}
#loader-text {
font-size: 24px;
text-transform: uppercase;
letter-spacing: 1px;
}
@keyframes spin {
0% {
transform: rotate(0deg);
}
100% {
transform: rotate(360deg);
}
}
@keyframes pulse {
0%, 100% {
opacity: 0.8;
}
50% {
opacity: 1;
}
}

341
static/css/workers.css Normal file

@@ -0,0 +1,341 @@
/* Styles specific to the workers page */
/* Search and filter controls */
.controls-bar {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 15px;
flex-wrap: wrap;
gap: 10px;
}
.search-box {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--text-color);
padding: 5px 10px;
font-family: var(--terminal-font);
min-width: 200px;
}
.search-box:focus {
outline: none;
box-shadow: 0 0 8px rgba(247, 147, 26, 0.5);
}
.filter-button {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 5px 10px;
font-family: var(--terminal-font);
cursor: pointer;
}
.filter-button.active {
background-color: var(--primary-color);
color: var(--bg-color);
}
/* Worker grid for worker cards */
.worker-grid {
display: grid;
grid-template-columns: repeat(auto-fill, minmax(250px, 1fr));
gap: 10px;
margin-top: 10px;
}
/* Worker card styles */
.worker-card {
background-color: var(--bg-color);
border: 1px solid var(--primary-color);
box-shadow: 0 0 5px rgba(247, 147, 26, 0.3);
position: relative;
overflow: hidden;
padding: 10px;
height: 100%;
animation: fadeIn 0.3s ease;
}
@keyframes fadeIn {
from { opacity: 0; }
to { opacity: 1; }
}
.worker-card::after {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: repeating-linear-gradient(
0deg,
rgba(0, 0, 0, 0.05),
rgba(0, 0, 0, 0.05) 1px,
transparent 1px,
transparent 2px
);
pointer-events: none;
z-index: 1;
}
.worker-card-online {
border-color: #32CD32;
box-shadow: 0 0 8px rgba(50, 205, 50, 0.4);
}
.worker-card-offline {
border-color: #ff5555;
box-shadow: 0 0 8px rgba(255, 85, 85, 0.4);
}
.worker-name {
color: var(--primary-color);
font-weight: bold;
font-size: 1.2rem;
margin-bottom: 5px;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
z-index: 2;
position: relative;
}
.worker-stats {
margin-top: 8px;
font-size: 0.9rem;
z-index: 2;
position: relative;
}
.worker-stats-row {
display: flex;
justify-content: space-between;
margin-bottom: 4px;
}
.worker-stats-label {
color: #aaa;
}
.hashrate-bar {
height: 4px;
background: linear-gradient(90deg, #1137F5, #39ff14);
margin-top: 4px;
margin-bottom: 8px;
position: relative;
z-index: 2;
}
/* Worker badge */
.worker-type {
position: absolute;
top: 10px;
right: 10px;
font-size: 0.7rem;
background-color: rgba(0, 0, 0, 0.6);
border: 1px solid var(--primary-color);
color: var(--primary-color);
padding: 1px 5px;
z-index: 2;
}
/* Status badges */
.status-badge {
display: inline-block;
font-size: 0.8rem;
padding: 2px 8px;
border-radius: 3px;
z-index: 2;
position: relative;
}
.status-badge-online {
background-color: rgba(50, 205, 50, 0.2);
border: 1px solid #32CD32;
color: #32CD32;
}
.status-badge-offline {
background-color: rgba(255, 85, 85, 0.2);
border: 1px solid #ff5555;
color: #ff5555;
}
/* Stats bars */
.stats-bar-container {
width: 100%;
height: 4px;
background-color: rgba(255, 255, 255, 0.1);
margin-top: 2px;
margin-bottom: 5px;
position: relative;
z-index: 2;
}
.stats-bar {
height: 100%;
background: linear-gradient(90deg, #ff2d2d, #39ff14);
}
/* Summary stats in the header */
.summary-stats {
display: flex;
flex-wrap: wrap;
justify-content: space-around;
gap: 15px;
margin: 15px 0;
}
.summary-stat {
text-align: center;
min-width: 120px;
}
.summary-stat-value {
font-size: 1.6rem;
/* font-weight: bold; */
margin-bottom: 5px;
}
.summary-stat-label {
font-size: 0.9rem;
color: #aaa;
}
/* Worker count ring */
.worker-ring {
width: 90px;
height: 90px;
border-radius: 50%;
position: relative;
margin: 0 auto;
background: conic-gradient(
#32CD32 0% calc(var(--online-percent) * 100%),
#ff5555 calc(var(--online-percent) * 100%) 100%
);
display: flex;
align-items: center;
justify-content: center;
box-shadow: 0 0 15px rgba(247, 147, 26, 0.3);
}
.worker-ring-inner {
width: 70px;
height: 70px;
border-radius: 50%;
background-color: var(--bg-color);
display: flex;
align-items: center;
justify-content: center;
font-size: 1.2rem;
font-weight: bold;
color: var(--text-color);
}
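The conic-gradient in `.worker-ring` keys off an `--online-percent` custom property that this stylesheet never sets; it has to be supplied from JS. A hedged sketch of how that might be done — the element lookup and the source of the online/total counts are assumptions:

```javascript
// Hypothetical: compute the online ratio (0..1) that the CSS
// conic-gradient expects in --online-percent.
function onlineRatio(onlineCount, totalCount) {
    if (totalCount <= 0) return 0;   // avoid division by zero
    return onlineCount / totalCount;
}

// e.g. document.querySelector('.worker-ring')
//        .style.setProperty('--online-percent', onlineRatio(8, 10));
```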
/* Mini hashrate chart */
.mini-chart {
height: 40px;
width: 100%;
margin-top: 5px;
position: relative;
z-index: 2;
}
.loading-fade {
opacity: 0.6;
transition: opacity 0.3s ease;
}
/* Mobile responsiveness */
@media (max-width: 576px) {
.controls-bar {
flex-direction: column;
align-items: stretch;
}
.search-box {
width: 100%;
}
.filter-buttons {
display: flex;
justify-content: space-between;
}
.worker-grid {
grid-template-columns: 1fr;
}
.summary-stats {
flex-direction: column;
align-items: center;
}
.summary-stat {
width: 100%;
}
}
@media (max-width: 768px) {
/* Fix for "Made by" link collision with title */
#topRightLink {
position: static !important;
display: block !important;
text-align: right !important;
margin-bottom: 0.5rem !important;
margin-top: 0 !important;
font-size: 0.8rem !important;
}
/* Adjust heading for better mobile display */
h1 {
font-size: 20px !important;
line-height: 1.2 !important;
margin-top: 0.5rem !important;
padding-top: 0 !important;
}
/* Improve container padding for mobile */
.container-fluid {
padding-left: 0.5rem !important;
padding-right: 0.5rem !important;
}
/* Ensure top section has appropriate spacing */
.row.mb-3 {
margin-top: 0.5rem !important;
}
}
/* Add a more aggressive breakpoint for very small screens */
@media (max-width: 380px) {
#topRightLink {
margin-bottom: 0.75rem !important;
font-size: 0.7rem !important;
}
h1 {
font-size: 18px !important;
margin-bottom: 0.5rem !important;
}
/* Further reduce container padding */
.container-fluid {
padding-left: 0.3rem !important;
padding-right: 0.3rem !important;
}
}
/* Add extra padding at bottom of worker grid to avoid overlap */
.worker-grid {
margin-bottom: 120px;
}
/* Ensure summary stats have proper spacing on mobile */
@media (max-width: 576px) {
.summary-stats {
margin-bottom: 60px;
}
}

File diff suppressed because it is too large

1040
static/js/blocks.js Normal file

File diff suppressed because it is too large

File diff suppressed because it is too large

503
static/js/notifications.js Normal file

@@ -0,0 +1,503 @@
"use strict";
// Global variables
let currentFilter = "all";
let currentOffset = 0;
const pageSize = 20;
let hasMoreNotifications = true;
let isLoading = false;
// Timezone configuration
let dashboardTimezone = 'America/Los_Angeles'; // Default
window.dashboardTimezone = dashboardTimezone; // Make it globally accessible
// Initialize when document is ready
$(document).ready(() => {
console.log("Notification page initializing...");
// Fetch timezone configuration
fetchTimezoneConfig();
// Set up filter buttons
$('.filter-button').click(function () {
$('.filter-button').removeClass('active');
$(this).addClass('active');
currentFilter = $(this).data('filter');
resetAndLoadNotifications();
});
// Set up action buttons
$('#mark-all-read').click(markAllAsRead);
$('#clear-read').click(clearReadNotifications);
$('#clear-all').click(clearAllNotifications);
$('#load-more').click(loadMoreNotifications);
// Initial load of notifications
loadNotifications();
// Start polling for unread count
startUnreadCountPolling();
// Initialize BitcoinMinuteRefresh if available
if (typeof BitcoinMinuteRefresh !== 'undefined' && BitcoinMinuteRefresh.initialize) {
BitcoinMinuteRefresh.initialize(refreshNotifications);
console.log("BitcoinMinuteRefresh initialized with refresh function");
}
// Start periodic update of notification timestamps every 30 seconds
setInterval(updateNotificationTimestamps, 30000);
});
// Fetch timezone configuration from server
function fetchTimezoneConfig() {
return fetch('/api/timezone')
.then(response => response.json())
.then(data => {
if (data && data.timezone) {
dashboardTimezone = data.timezone;
window.dashboardTimezone = dashboardTimezone; // Make it globally accessible
console.log(`Notifications page using timezone: ${dashboardTimezone}`);
// Store in localStorage for future use
try {
localStorage.setItem('dashboardTimezone', dashboardTimezone);
} catch (e) {
console.error("Error storing timezone in localStorage:", e);
}
// Update all timestamps with the new timezone
updateNotificationTimestamps();
return dashboardTimezone;
}
})
.catch(error => {
console.error('Error fetching timezone config:', error);
return null;
});
}
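`fetchTimezoneConfig` caches the timezone in localStorage "for future use", but nothing in this file reads the cache back — a page could use it to pick a sensible timezone while the `/api/timezone` fetch is still in flight. A sketch of that fallback, assuming the same `dashboardTimezone` storage key; the helper name is illustrative:

```javascript
// Hypothetical helper: resolve the timezone to use before the
// /api/timezone response arrives, preferring a previously cached value.
function resolveInitialTimezone(storage, fallback) {
    try {
        const cached = storage.getItem('dashboardTimezone');
        if (cached) return cached;
    } catch (e) {
        // localStorage can throw (private browsing, storage disabled)
    }
    return fallback;
}
```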
// Load notifications with current filter
function loadNotifications() {
if (isLoading) return;
isLoading = true;
showLoading();
const params = {
limit: pageSize,
offset: currentOffset
};
if (currentFilter !== "all") {
params.category = currentFilter;
}
$.ajax({
url: `/api/notifications?${$.param(params)}`,
method: "GET",
dataType: "json",
success: (data) => {
renderNotifications(data.notifications, currentOffset === 0);
updateUnreadBadge(data.unread_count);
// Update load more button state
hasMoreNotifications = data.notifications.length === pageSize;
$('#load-more').prop('disabled', !hasMoreNotifications);
isLoading = false;
},
error: (xhr, status, error) => {
console.error("Error loading notifications:", error);
showError("Failed to load notifications. Please try again.");
isLoading = false;
}
});
}
// Reset offset and load notifications
function resetAndLoadNotifications() {
currentOffset = 0;
loadNotifications();
}
// Load more notifications
function loadMoreNotifications() {
if (!hasMoreNotifications || isLoading) return;
currentOffset += pageSize;
loadNotifications();
}
// Refresh notifications (for periodic updates)
function refreshNotifications() {
// Only refresh if we're on the first page
if (currentOffset === 0) {
resetAndLoadNotifications();
} else {
// Just update the unread count
updateUnreadCount();
}
}
// This refreshes all timestamps on the page periodically
function updateNotificationTimestamps() {
$('.notification-item').each(function () {
const timestampStr = $(this).attr('data-timestamp');
if (timestampStr) {
try {
const timestamp = new Date(timestampStr);
// Update relative time
$(this).find('.notification-time').text(formatTimestamp(timestamp));
// Update full timestamp with configured timezone
if ($(this).find('.full-timestamp').length) {
const options = {
year: 'numeric',
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit',
second: '2-digit',
hour12: true,
timeZone: window.dashboardTimezone || 'America/Los_Angeles'
};
const fullTimestamp = timestamp.toLocaleString('en-US', options);
$(this).find('.full-timestamp').text(fullTimestamp);
}
} catch (e) {
console.error("Error updating timestamp:", e, timestampStr);
}
}
});
}
// Show loading indicator
function showLoading() {
if (currentOffset === 0) {
// First page load, show loading message
$('#notifications-container').html('<div class="loading-message">Loading notifications<span class="terminal-cursor"></span></div>');
} else {
// Pagination load, show loading below
$('#load-more').prop('disabled', true).text('Loading...');
}
}
// Show error message
function showError(message) {
$('#notifications-container').html(`<div class="error-message">${message}</div>`);
$('#load-more').hide();
}
// Render notifications in the container
function renderNotifications(notifications, isFirstPage) {
const container = $('#notifications-container');
// If first page and no notifications
if (isFirstPage && (!notifications || notifications.length === 0)) {
container.html($('#empty-template').html());
$('#load-more').hide();
return;
}
// If first page, clear container
if (isFirstPage) {
container.empty();
}
// Render each notification
notifications.forEach(notification => {
const notificationElement = createNotificationElement(notification);
container.append(notificationElement);
});
// Show/hide load more button
$('#load-more').show().prop('disabled', !hasMoreNotifications);
}
// Create notification element from template
function createNotificationElement(notification) {
const template = $('#notification-template').html();
const element = $(template);
// Set data attributes
element.attr('data-id', notification.id)
.attr('data-level', notification.level)
.attr('data-category', notification.category)
.attr('data-read', notification.read)
.attr('data-timestamp', notification.timestamp);
// Set icon based on level
const iconElement = element.find('.notification-icon i');
switch (notification.level) {
case 'success':
iconElement.addClass('fa-check-circle');
break;
case 'info':
iconElement.addClass('fa-info-circle');
break;
case 'warning':
iconElement.addClass('fa-exclamation-triangle');
break;
case 'error':
iconElement.addClass('fa-times-circle');
break;
default:
iconElement.addClass('fa-bell');
}
// Important: Do not append "Z" here, as that can cause timezone issues
// Create a date object from the notification timestamp
let notificationDate;
try {
// Parse the timestamp directly without modifications
notificationDate = new Date(notification.timestamp);
// Validate the date object - if invalid, try alternative approach
if (isNaN(notificationDate.getTime())) {
console.warn("Invalid date from notification timestamp, trying alternative format");
// Try adding Z to make it explicit UTC if not already ISO format
if (!notification.timestamp.endsWith('Z') && !notification.timestamp.includes('+')) {
notificationDate = new Date(notification.timestamp + 'Z');
}
}
} catch (e) {
console.error("Error parsing notification date:", e);
notificationDate = new Date(); // Fallback to current date
}
// Format the timestamp using the configured timezone
const options = {
year: 'numeric',
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit',
second: '2-digit',
hour12: true,
timeZone: window.dashboardTimezone || 'America/Los_Angeles'
};
// Format full timestamp with configured timezone
let fullTimestamp;
try {
fullTimestamp = notificationDate.toLocaleString('en-US', options);
} catch (e) {
console.error("Error formatting timestamp with timezone:", e);
fullTimestamp = notificationDate.toLocaleString('en-US'); // Fallback without timezone
}
// Append the message and formatted timestamp
const messageWithTimestamp = `${notification.message}<br><span class="full-timestamp">${fullTimestamp}</span>`;
element.find('.notification-message').html(messageWithTimestamp);
// Set metadata for relative time display
element.find('.notification-time').text(formatTimestamp(notificationDate));
element.find('.notification-category').text(notification.category);
// Set up action buttons
element.find('.mark-read-button').on('click', (e) => {
e.stopPropagation();
markAsRead(notification.id);
});
element.find('.delete-button').on('click', (e) => {
e.stopPropagation();
deleteNotification(notification.id);
});
// Hide mark as read button if already read
if (notification.read) {
element.find('.mark-read-button').hide();
}
return element;
}
function formatTimestamp(timestamp) {
// Ensure we have a valid date object
let dateObj = timestamp;
if (!(timestamp instanceof Date) || isNaN(timestamp.getTime())) {
try {
dateObj = new Date(timestamp);
} catch (e) {
console.error("Invalid timestamp in formatTimestamp:", e);
return "unknown time";
}
}
// Calculate time difference in local timezone context
const now = new Date();
const diffMs = now - dateObj;
const diffSec = Math.floor(diffMs / 1000);
const diffMin = Math.floor(diffSec / 60);
const diffHour = Math.floor(diffMin / 60);
const diffDay = Math.floor(diffHour / 24);
if (diffSec < 60) {
return "just now";
} else if (diffMin < 60) {
return `${diffMin}m ago`;
} else if (diffHour < 24) {
return `${diffHour}h ago`;
} else if (diffDay < 30) {
return `${diffDay}d ago`;
} else {
// Format as date for older notifications using configured timezone
const options = {
year: 'numeric',
month: 'short',
day: 'numeric',
timeZone: window.dashboardTimezone || 'America/Los_Angeles'
};
return dateObj.toLocaleDateString('en-US', options);
}
}
// Mark a notification as read
function markAsRead(notificationId) {
$.ajax({
url: "/api/notifications/mark_read",
method: "POST",
data: JSON.stringify({ notification_id: notificationId }),
contentType: "application/json",
success: (data) => {
// Update UI
$(`[data-id="${notificationId}"]`).attr('data-read', 'true');
$(`[data-id="${notificationId}"]`).find('.mark-read-button').hide();
// Update unread badge
updateUnreadBadge(data.unread_count);
},
error: (xhr, status, error) => {
console.error("Error marking notification as read:", error);
}
});
}
// Mark all notifications as read
function markAllAsRead() {
$.ajax({
url: "/api/notifications/mark_read",
method: "POST",
data: JSON.stringify({}),
contentType: "application/json",
success: (data) => {
// Update UI
$('.notification-item').attr('data-read', 'true');
$('.mark-read-button').hide();
// Update unread badge
updateUnreadBadge(0);
},
error: (xhr, status, error) => {
console.error("Error marking all notifications as read:", error);
}
});
}
// Delete a notification
function deleteNotification(notificationId) {
$.ajax({
url: "/api/notifications/delete",
method: "POST",
data: JSON.stringify({ notification_id: notificationId }),
contentType: "application/json",
success: (data) => {
// Remove from UI with animation
$(`[data-id="${notificationId}"]`).fadeOut(300, function () {
$(this).remove();
// Check if container is empty now
if ($('#notifications-container').children().length === 0) {
$('#notifications-container').html($('#empty-template').html());
$('#load-more').hide();
}
});
// Update unread badge
updateUnreadBadge(data.unread_count);
},
error: (xhr, status, error) => {
console.error("Error deleting notification:", error);
}
});
}
// Clear read notifications
function clearReadNotifications() {
if (!confirm("Are you sure you want to clear all read notifications?")) {
return;
}
$.ajax({
url: "/api/notifications/clear",
method: "POST",
data: JSON.stringify({
// Special parameter to clear only read notifications
read_only: true
}),
contentType: "application/json",
success: () => {
// Reload notifications
resetAndLoadNotifications();
},
error: (xhr, status, error) => {
console.error("Error clearing read notifications:", error);
}
});
}
// Clear all notifications
function clearAllNotifications() {
if (!confirm("Are you sure you want to clear ALL notifications? This cannot be undone.")) {
return;
}
$.ajax({
url: "/api/notifications/clear",
method: "POST",
data: JSON.stringify({}),
contentType: "application/json",
success: () => {
// Reload notifications
resetAndLoadNotifications();
},
error: (xhr, status, error) => {
console.error("Error clearing all notifications:", error);
}
});
}
// Update unread badge
function updateUnreadBadge(count) {
$('#unread-badge').text(count);
// Add special styling if unread
if (count > 0) {
$('#unread-badge').addClass('has-unread');
} else {
$('#unread-badge').removeClass('has-unread');
}
}
// Update unread count from API
function updateUnreadCount() {
$.ajax({
url: "/api/notifications/unread_count",
method: "GET",
success: (data) => {
updateUnreadBadge(data.unread_count);
},
error: (xhr, status, error) => {
console.error("Error updating unread count:", error);
}
});
}
// Start polling for unread count
function startUnreadCountPolling() {
// Update every 30 seconds
setInterval(updateUnreadCount, 30000);
}


@@ -1,238 +0,0 @@
// This script integrates the retro floating refresh bar
// with the existing dashboard and workers page functionality
(function() {
// Wait for DOM to be ready
document.addEventListener('DOMContentLoaded', function() {
// Create the retro terminal bar if it doesn't exist yet
if (!document.getElementById('retro-terminal-bar')) {
createRetroTerminalBar();
}
// Hide the original refresh container
const originalRefreshUptime = document.getElementById('refreshUptime');
if (originalRefreshUptime) {
originalRefreshUptime.style.visibility = 'hidden';
originalRefreshUptime.style.height = '0';
originalRefreshUptime.style.overflow = 'hidden';
// Important: We keep the original elements and just hide them
// This ensures all existing JavaScript functions still work
}
// Add extra space at the bottom of the page to prevent the floating bar from covering content
const extraSpace = document.createElement('div');
extraSpace.style.height = '100px';
document.body.appendChild(extraSpace);
});
// Function to create the retro terminal bar
function createRetroTerminalBar() {
// Get the HTML content from the shared CSS/HTML
const html = `
<div id="retro-terminal-bar">
<div class="terminal-header">
<div class="terminal-title">SYSTEM MONITOR v0.1</div>
<div class="terminal-controls">
<div class="terminal-dot minimize" title="Minimize" onclick="toggleTerminal()"></div>
<div class="terminal-dot close" title="Close" onclick="hideTerminal()"></div>
</div>
</div>
<div class="terminal-content">
<div class="status-indicators">
<div class="status-indicator">
<div class="status-dot connected"></div>
<span>LIVE</span>
</div>
<div class="status-indicator">
<span id="data-refresh-time">00:00:00</span>
</div>
</div>
<div id="refreshContainer">
<!-- Enhanced progress bar with tick marks -->
<div class="bitcoin-progress-container">
<div id="bitcoin-progress-inner">
<div class="scan-line"></div>
</div>
<div class="progress-ticks">
<span>0s</span>
<span>15s</span>
<span>30s</span>
<span>45s</span>
<span>60s</span>
</div>
<!-- Add tick marks every 5 seconds -->
<div class="tick-mark major" style="left: 0%"></div>
<div class="tick-mark" style="left: 8.33%"></div>
<div class="tick-mark" style="left: 16.67%"></div>
<div class="tick-mark major" style="left: 25%"></div>
<div class="tick-mark" style="left: 33.33%"></div>
<div class="tick-mark" style="left: 41.67%"></div>
<div class="tick-mark major" style="left: 50%"></div>
<div class="tick-mark" style="left: 58.33%"></div>
<div class="tick-mark" style="left: 66.67%"></div>
<div class="tick-mark major" style="left: 75%"></div>
<div class="tick-mark" style="left: 83.33%"></div>
<div class="tick-mark" style="left: 91.67%"></div>
<div class="tick-mark major" style="left: 100%"></div>
</div>
</div>
<div id="progress-text">60s to next update</div>
<div id="uptimeTimer"><strong>Uptime:</strong> 0h 0m 0s</div>
</div>
</div>
`;
// Create a container for the HTML
const container = document.createElement('div');
container.innerHTML = html;
// Append to the body
document.body.appendChild(container.firstElementChild);
// Start the clock update
updateTerminalClock();
setInterval(updateTerminalClock, 1000);
// Check if terminal should be collapsed based on previous state
const isCollapsed = localStorage.getItem('terminalCollapsed') === 'true';
if (isCollapsed) {
document.getElementById('retro-terminal-bar').classList.add('collapsed');
}
}
// Function to update the terminal clock
function updateTerminalClock() {
const clockElement = document.getElementById('data-refresh-time');
if (clockElement) {
const now = new Date();
const hours = String(now.getHours()).padStart(2, '0');
const minutes = String(now.getMinutes()).padStart(2, '0');
const seconds = String(now.getSeconds()).padStart(2, '0');
clockElement.textContent = `${hours}:${minutes}:${seconds}`;
}
}
// Expose these functions globally for the onclick handlers
window.toggleTerminal = function() {
const terminal = document.getElementById('retro-terminal-bar');
terminal.classList.toggle('collapsed');
// Store state in localStorage
localStorage.setItem('terminalCollapsed', terminal.classList.contains('collapsed'));
};
window.hideTerminal = function() {
document.getElementById('retro-terminal-bar').style.display = 'none';
// Create a show button that appears at the bottom right
const showButton = document.createElement('button');
showButton.id = 'show-terminal-button';
showButton.textContent = 'Show Monitor';
showButton.style.position = 'fixed';
showButton.style.bottom = '10px';
showButton.style.right = '10px';
showButton.style.zIndex = '1000';
showButton.style.backgroundColor = '#f7931a';
showButton.style.color = '#000';
showButton.style.border = 'none';
showButton.style.padding = '8px 12px';
showButton.style.cursor = 'pointer';
showButton.style.fontFamily = "'VT323', monospace";
showButton.style.fontSize = '14px';
showButton.onclick = function() {
document.getElementById('retro-terminal-bar').style.display = 'block';
this.remove();
};
document.body.appendChild(showButton);
};
// Redirect original progress bar updates to our new floating bar
// This Observer will listen for changes to the original #bitcoin-progress-inner
// and replicate them to our new floating bar version
const initProgressObserver = function() {
// Setup a MutationObserver to watch for style changes on the original progress bar
const originalProgressBar = document.querySelector('#refreshUptime #bitcoin-progress-inner');
const newProgressBar = document.querySelector('#retro-terminal-bar #bitcoin-progress-inner');
if (originalProgressBar && newProgressBar) {
const observer = new MutationObserver(function(mutations) {
mutations.forEach(function(mutation) {
if (mutation.attributeName === 'style') {
// Get the width from the original progress bar
const width = originalProgressBar.style.width;
if (width) {
// Apply it to our new progress bar
newProgressBar.style.width = width;
// Also copy any classes (like glow-effect)
if (originalProgressBar.classList.contains('glow-effect') &&
!newProgressBar.classList.contains('glow-effect')) {
newProgressBar.classList.add('glow-effect');
} else if (!originalProgressBar.classList.contains('glow-effect') &&
newProgressBar.classList.contains('glow-effect')) {
newProgressBar.classList.remove('glow-effect');
}
// Copy waiting-for-update class
if (originalProgressBar.classList.contains('waiting-for-update') &&
!newProgressBar.classList.contains('waiting-for-update')) {
newProgressBar.classList.add('waiting-for-update');
} else if (!originalProgressBar.classList.contains('waiting-for-update') &&
newProgressBar.classList.contains('waiting-for-update')) {
newProgressBar.classList.remove('waiting-for-update');
}
}
}
});
});
// Start observing
observer.observe(originalProgressBar, { attributes: true });
}
// Also watch for changes to the progress text
const originalProgressText = document.querySelector('#refreshUptime #progress-text');
const newProgressText = document.querySelector('#retro-terminal-bar #progress-text');
if (originalProgressText && newProgressText) {
const textObserver = new MutationObserver(function(mutations) {
mutations.forEach(function(mutation) {
if (mutation.type === 'childList') {
// Update the text in our new bar
newProgressText.textContent = originalProgressText.textContent;
}
});
});
// Start observing
textObserver.observe(originalProgressText, { childList: true, subtree: true });
}
// Watch for changes to the uptime timer
const originalUptimeTimer = document.querySelector('#refreshUptime #uptimeTimer');
const newUptimeTimer = document.querySelector('#retro-terminal-bar #uptimeTimer');
if (originalUptimeTimer && newUptimeTimer) {
const uptimeObserver = new MutationObserver(function(mutations) {
mutations.forEach(function(mutation) {
if (mutation.type === 'childList') {
// Update the text in our new bar
newUptimeTimer.innerHTML = originalUptimeTimer.innerHTML;
}
});
});
// Start observing
uptimeObserver.observe(originalUptimeTimer, { childList: true, subtree: true });
}
};
// Start the observer once the page is fully loaded
window.addEventListener('load', function() {
// Give a short delay to ensure all elements are rendered
setTimeout(initProgressObserver, 500);
});
})();

440
static/js/theme.js Normal file

@@ -0,0 +1,440 @@
// Guard flag to prevent re-entrant theme application
let isApplyingTheme = false;
// Bitcoin Orange theme (default)
const BITCOIN_THEME = {
PRIMARY: '#f2a900',
PRIMARY_RGB: '242, 169, 0',
SHARED: {
GREEN: '#32CD32',
RED: '#ff5555',
YELLOW: '#ffd700'
},
CHART: {
GRADIENT_START: '#f2a900',
GRADIENT_END: 'rgba(242, 169, 0, 0.2)',
ANNOTATION: '#ffd700'
}
};
// DeepSea theme (blue alternative)
const DEEPSEA_THEME = {
PRIMARY: '#0088cc',
PRIMARY_RGB: '0, 136, 204',
SHARED: {
GREEN: '#32CD32',
RED: '#ff5555',
YELLOW: '#ffd700'
},
CHART: {
GRADIENT_START: '#0088cc',
GRADIENT_END: 'rgba(0, 136, 204, 0.2)',
ANNOTATION: '#00b3ff'
}
};
// Global theme constants
const THEME = {
BITCOIN: BITCOIN_THEME,
DEEPSEA: DEEPSEA_THEME,
SHARED: BITCOIN_THEME.SHARED
};
// Function to get the current theme based on localStorage setting
function getCurrentTheme() {
const useDeepSea = localStorage.getItem('useDeepSeaTheme') === 'true';
return useDeepSea ? DEEPSEA_THEME : BITCOIN_THEME;
}
// Make globals available
window.THEME = THEME;
window.getCurrentTheme = getCurrentTheme;
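Since `getCurrentTheme()` only branches on whether the stored flag is the string `'true'`, the lookup can be exercised outside the browser by injecting a storage object. A sketch with trimmed-down theme objects and a hypothetical `getThemeFor` helper (the stub storage is not part of the page):

```javascript
// Trimmed theme objects; the real ones also carry SHARED and CHART colors.
const BITCOIN = { PRIMARY: '#f2a900' };
const DEEPSEA = { PRIMARY: '#0088cc' };

function getThemeFor(storage) {
    // Mirrors getCurrentTheme(): the preference is stored as the string 'true'.
    return storage.getItem('useDeepSeaTheme') === 'true' ? DEEPSEA : BITCOIN;
}

const backing = new Map();
const storage = { getItem: k => (backing.has(k) ? backing.get(k) : null) };
console.log(getThemeFor(storage).PRIMARY); // '#f2a900' (Bitcoin default)
backing.set('useDeepSeaTheme', 'true');
console.log(getThemeFor(storage).PRIMARY); // '#0088cc'
```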
// Use window-scoped variable to prevent conflicts
window.themeProcessing = false;
// Fixed applyDeepSeaTheme function with recursion protection
function applyDeepSeaTheme() {
// Check if we're already applying the theme to prevent recursion
if (isApplyingTheme) {
console.log("Theme application already in progress, avoiding recursion");
return;
}
// Set the guard flag (reset in the finally block below)
isApplyingTheme = true;
try {
console.log("Applying DeepSea theme...");
// Create or update CSS variables for the DeepSea theme
const styleElement = document.createElement('style');
styleElement.id = 'deepSeaThemeStyles'; // Give it an ID so we can check if it exists
// Enhanced CSS with clean, organized structure
styleElement.textContent = `
/* Base theme variables */
:root {
--primary-color: #0088cc;
--primary-color-rgb: 0, 136, 204;
--accent-color: #00b3ff;
--bg-gradient: linear-gradient(135deg, #0a0a0a, #131b20);
}
/* Card styling */
.card {
border: 1px solid var(--primary-color) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.3) !important;
}
.card-header, .card > .card-header {
background: linear-gradient(to right, var(--primary-color), #006699) !important;
border-bottom: 1px solid var(--primary-color) !important;
color: #fff !important;
}
/* Navigation */
.nav-link {
border: 1px solid var(--primary-color) !important;
color: var(--primary-color) !important;
}
.nav-link:hover, .nav-link.active {
background-color: var(--primary-color) !important;
color: #fff !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
/* Interface elements */
#terminal-cursor {
background-color: var(--primary-color) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.8) !important;
}
#lastUpdated {
color: var(--primary-color) !important;
}
h1, .text-center h1 {
color: var(--primary-color) !important;
}
.nav-badge {
background-color: var(--primary-color) !important;
}
/* Bitcoin progress elements */
.bitcoin-progress-inner {
background: linear-gradient(90deg, var(--primary-color), var(--accent-color)) !important;
}
.bitcoin-progress-container {
border: 1px solid var(--primary-color) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
/* Theme toggle button styling */
#themeToggle, button.theme-toggle, .toggle-theme-btn {
background: transparent !important;
border: 1px solid var(--primary-color) !important;
color: var(--primary-color) !important;
transition: all 0.3s ease !important;
}
#themeToggle:hover, button.theme-toggle:hover, .toggle-theme-btn:hover {
background-color: rgba(var(--primary-color-rgb), 0.1) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.3) !important;
}
/* ===== SPECIAL CASE FIXES ===== */
/* Pool hashrate - always white */
[id^="pool_"] {
color: #ffffff !important;
}
/* Block page elements */
.stat-item strong,
.block-height,
.block-detail-title {
color: var(--primary-color) !important;
}
/* Block inputs and button styles */
.block-input:focus {
outline: none !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
.block-button:hover {
background-color: var(--primary-color) !important;
color: #000 !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
/* Notification page elements */
.filter-button.active {
background-color: var(--primary-color) !important;
color: #000 !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
.filter-button:hover,
.action-button:hover:not(.danger),
.load-more-button:hover {
background-color: rgba(var(--primary-color-rgb), 0.2) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.3) !important;
}
/* Block cards and modals */
.block-card:hover {
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
transform: translateY(-2px);
}
.block-modal-content {
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.5) !important;
}
.block-modal-close:hover,
.block-modal-close:focus {
color: var(--accent-color) !important;
}
/* ===== COLOR CATEGORIES ===== */
/* YELLOW - SATOSHI EARNINGS & BTC PRICE */
[id$="_sats"],
#btc_price,
.metric-value[id$="_sats"],
.est_time_to_payout:not(.green):not(.red) {
color: #ffd700 !important;
}
/* GREEN - POSITIVE USD VALUES */
.metric-value.green,
span.green,
#daily_revenue:not([style*="color: #ff"]),
#monthly_profit_usd:not([style*="color: #ff"]),
#daily_profit_usd:not([style*="color: #ff"]),
.status-green,
#pool_luck.very-lucky,
#pool_luck.lucky {
color: #32CD32 !important;
}
.online-dot {
background: #32CD32 !important;
box-shadow: 0 0 10px #32CD32, 0 0 10px #32CD32 !important;
}
/* Light green for "lucky" status */
#pool_luck.lucky {
color: #90EE90 !important;
}
/* NORMAL LUCK - KHAKI */
#pool_luck.normal-luck {
color: #F0E68C !important;
}
/* RED - NEGATIVE VALUES & WARNINGS */
.metric-value.red,
span.red,
.status-red,
#daily_power_cost,
#pool_luck.unlucky {
color: #ff5555 !important;
}
.offline-dot {
background: #ff5555 !important;
box-shadow: 0 0 10px #ff5555, 0 0 10px #ff5555 !important;
}
/* WHITE - NETWORK STATS & WORKER DATA */
#block_number,
#difficulty,
#network_hashrate,
#pool_fees_percentage,
#workers_hashing,
#last_share,
#blocks_found,
#last_block_height,
#hashrate_24hr,
#hashrate_3hr,
#hashrate_10min,
#hashrate_60sec {
color: #ffffff !important;
}
/* CYAN - TIME AGO IN LAST BLOCK */
#last_block_time {
color: #00ffff !important;
}
/* CONGRATULATIONS MESSAGE */
#congratsMessage {
background: var(--primary-color) !important;
box-shadow: 0 0 10px rgba(var(--primary-color-rgb), 0.7) !important;
}
/* ANIMATIONS */
@keyframes waitingPulse {
0%, 100% { box-shadow: 0 0 10px var(--primary-color), 0 0 10px var(--primary-color) !important; opacity: 0.8; }
50% { box-shadow: 0 0 10px var(--primary-color), 0 0 10px var(--primary-color) !important; opacity: 1; }
}
@keyframes glow {
0%, 100% { box-shadow: 0 0 10px var(--primary-color), 0 0 10px var(--primary-color) !important; }
50% { box-shadow: 0 0 10px var(--primary-color), 0 0 10px var(--primary-color) !important; }
}
`;
// Check if our style element already exists
const existingStyle = document.getElementById('deepSeaThemeStyles');
if (existingStyle) {
existingStyle.parentNode.removeChild(existingStyle);
}
// Add our new style element to the head
document.head.appendChild(styleElement);
// Update page title
document.title = document.title.replace("BTC-OS", "DeepSea");
document.title = document.title.replace("Bitcoin", "DeepSea");
// Update header text
const headerElement = document.querySelector('h1');
if (headerElement) {
headerElement.innerHTML = headerElement.innerHTML.replace("BTC-OS", "DeepSea");
headerElement.innerHTML = headerElement.innerHTML.replace("BITCOIN", "DEEPSEA");
}
// Update theme toggle button
const themeToggle = document.getElementById('themeToggle');
if (themeToggle) {
themeToggle.style.borderColor = '#0088cc';
themeToggle.style.color = '#0088cc';
}
console.log("DeepSea theme applied with color adjustments");
} finally {
// Reset the guard flag when done, even if there's an error
setTimeout(() => { isApplyingTheme = false; }, 100);
}
}
// Make the function accessible globally
window.applyDeepSeaTheme = applyDeepSeaTheme;
// Toggle theme with hard page refresh
function toggleTheme() {
const useDeepSea = localStorage.getItem('useDeepSeaTheme') !== 'true';
// Save the new theme preference
saveThemePreference(useDeepSea);
// Show a themed loading message
const loadingMessage = document.createElement('div');
loadingMessage.id = 'theme-loader';
const icon = document.createElement('div');
icon.id = 'loader-icon';
icon.innerHTML = useDeepSea ? '🌊' : '₿';
const text = document.createElement('div');
text.id = 'loader-text';
text.textContent = 'Applying ' + (useDeepSea ? 'DeepSea' : 'Bitcoin') + ' Theme';
loadingMessage.appendChild(icon);
loadingMessage.appendChild(text);
// Apply immediate styling
loadingMessage.style.position = 'fixed';
loadingMessage.style.top = '0';
loadingMessage.style.left = '0';
loadingMessage.style.width = '100%';
loadingMessage.style.height = '100%';
loadingMessage.style.backgroundColor = useDeepSea ? '#0c141a' : '#111111';
loadingMessage.style.color = useDeepSea ? '#0088cc' : '#f2a900';
loadingMessage.style.display = 'flex';
loadingMessage.style.flexDirection = 'column';
loadingMessage.style.justifyContent = 'center';
loadingMessage.style.alignItems = 'center';
loadingMessage.style.zIndex = '9999';
loadingMessage.style.fontFamily = "'VT323', monospace";
document.body.appendChild(loadingMessage);
// Short delay before refreshing
setTimeout(() => {
// Hard reload the page
window.location.reload();
}, 500);
}
// Set theme preference to localStorage
function saveThemePreference(useDeepSea) {
try {
localStorage.setItem('useDeepSeaTheme', useDeepSea);
} catch (e) {
console.error("Error saving theme preference:", e);
}
}
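`saveThemePreference()` wraps the write in try/catch because `localStorage.setItem` can throw (private browsing, quota exceeded). The same contract as a reusable sketch with injected storage (`safeSet` is an illustrative name, not part of the page):

```javascript
// Returns true on success, false if the storage backend throws,
// matching the swallow-and-log behavior of saveThemePreference().
function safeSet(storage, key, value) {
    try {
        storage.setItem(key, String(value));
        return true;
    } catch (e) {
        console.error('Error saving ' + key + ':', e);
        return false;
    }
}

const memory = new Map();
const working = { setItem: (k, v) => memory.set(k, v) };
const throwing = { setItem() { throw new Error('quota exceeded'); } };
console.log(safeSet(working, 'useDeepSeaTheme', true));  // true
console.log(safeSet(throwing, 'useDeepSeaTheme', true)); // false (error logged)
```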
// Check if this is the first startup by checking for the "firstStartup" flag
function isFirstStartup() {
return localStorage.getItem('hasStartedBefore') !== 'true';
}
// Mark that the app has started before
function markAppStarted() {
try {
localStorage.setItem('hasStartedBefore', 'true');
} catch (e) {
console.error("Error marking app as started:", e);
}
}
// Initialize DeepSea as default on first startup
function initializeDefaultTheme() {
if (isFirstStartup()) {
console.log("First startup detected, setting DeepSea as default theme");
saveThemePreference(true); // Set DeepSea theme as default (true)
markAppStarted();
return true;
}
return false;
}
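`initializeDefaultTheme()` combines the first-startup check, the DeepSea preference write, and the started-before flag write. A condensed, storage-injected sketch of that flow (`firstRunDefault` and the stub storage are illustrative):

```javascript
// On the very first run, DeepSea becomes the default and the
// hasStartedBefore flag is set so later runs keep the user's choice.
function firstRunDefault(storage) {
    if (storage.getItem('hasStartedBefore') !== 'true') {
        storage.setItem('useDeepSeaTheme', 'true');
        storage.setItem('hasStartedBefore', 'true');
        return true; // first startup
    }
    return false;
}

const store = new Map();
const storage = {
    getItem: k => store.get(k) ?? null,
    setItem: (k, v) => store.set(k, String(v))
};
console.log(firstRunDefault(storage)); // true on first call
console.log(firstRunDefault(storage)); // false afterwards
```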
// Check for theme preference in localStorage
function loadThemePreference() {
try {
// Check if it's first startup - if so, set DeepSea as default
const isFirstTime = initializeDefaultTheme();
// Get theme preference from localStorage
const themePreference = localStorage.getItem('useDeepSeaTheme');
// Apply theme based on preference
if (themePreference === 'true' || isFirstTime) {
applyDeepSeaTheme();
} else {
// Make sure the toggle button is styled correctly for Bitcoin theme
const themeToggle = document.getElementById('themeToggle');
if (themeToggle) {
themeToggle.style.borderColor = '#f2a900';
themeToggle.style.color = '#f2a900';
}
}
} catch (e) {
console.error("Error loading theme preference:", e);
}
}
// Apply theme on page load
document.addEventListener('DOMContentLoaded', loadThemePreference);
// For pages that load content dynamically, also check when the window loads
window.addEventListener('load', loadThemePreference);


@ -3,11 +3,9 @@
 // Global variables for workers dashboard
 let workerData = null;
 let refreshTimer;
-let pageLoadTime = Date.now();
-let currentProgress = 0;
-const PROGRESS_MAX = 60; // 60 seconds for a complete cycle
-let lastUpdateTime = Date.now();
-let filterState = {
+const pageLoadTime = Date.now();
+let lastManualRefreshTime = 0;
+const filterState = {
 currentFilter: 'all',
 searchTerm: ''
 };
@ -19,189 +17,129 @@ let serverTimeOffset = 0;
 let serverStartTime = null;
-// New variable to track custom refresh timing
-let lastManualRefreshTime = 0;
-const MIN_REFRESH_INTERVAL = 10000; // Minimum 10 seconds between refreshes
-// Initialize the page
-$(document).ready(function() {
-// Set up initial UI
-initializePage();
-// Get server time for uptime calculation
-updateServerTime();
-// Set up refresh synchronization with main dashboard
-setupRefreshSync();
-// Fetch worker data immediately on page load
-fetchWorkerData();
-// Set up refresh timer
-setInterval(updateProgressBar, 1000);
-// Set up uptime timer - synced with main dashboard
-setInterval(updateUptime, 1000);
-// Start server time polling - same as main dashboard
-setInterval(updateServerTime, 30000);
-// Auto-refresh worker data - aligned with main dashboard if possible
-setInterval(function() {
-// Check if it's been at least PROGRESS_MAX seconds since last update
-const timeSinceLastUpdate = Date.now() - lastUpdateTime;
-if (timeSinceLastUpdate >= PROGRESS_MAX * 1000) {
-// Check if there was a recent manual refresh
-const timeSinceManualRefresh = Date.now() - lastManualRefreshTime;
-if (timeSinceManualRefresh >= MIN_REFRESH_INTERVAL) {
-console.log("Auto-refresh triggered after time interval");
-fetchWorkerData();
-}
+// Hashrate Normalization Utilities
+// Helper function to normalize hashrate to TH/s for consistent graphing
+function normalizeHashrate(value, unit = 'th/s') {
+if (!value || isNaN(value)) return 0;
+unit = unit.toLowerCase();
+const unitConversion = {
+'ph/s': 1000,
+'eh/s': 1000000,
+'gh/s': 1 / 1000,
+'mh/s': 1 / 1000000,
+'kh/s': 1 / 1000000000,
+'h/s': 1 / 1000000000000
+};
+return unitConversion[unit] !== undefined ? value * unitConversion[unit] : value;
+}
+// Helper function to format hashrate values for display
+function formatHashrateForDisplay(value, unit) {
+if (isNaN(value) || value === null || value === undefined) return "N/A";
+const normalizedValue = unit ? normalizeHashrate(value, unit) : value;
+const unitRanges = [
+{ threshold: 1000000, unit: 'EH/s', divisor: 1000000 },
+{ threshold: 1000, unit: 'PH/s', divisor: 1000 },
+{ threshold: 1, unit: 'TH/s', divisor: 1 },
+{ threshold: 0.001, unit: 'GH/s', divisor: 1 / 1000 },
+{ threshold: 0, unit: 'MH/s', divisor: 1 / 1000000 }
+];
+for (const range of unitRanges) {
+if (normalizedValue >= range.threshold) {
+return (normalizedValue / range.divisor).toFixed(2) + ' ' + range.unit;
+}
 }
-}, 10000); // Check every 10 seconds to align better with main dashboard
-// Set up filter button click handlers
-$('.filter-button').click(function() {
+return (normalizedValue * 1000000).toFixed(2) + ' MH/s';
+}
+// Initialize the page
+$(document).ready(function () {
+console.log("Worker page initializing...");
+initNotificationBadge();
+initializePage();
+updateServerTime();
+window.manualRefresh = fetchWorkerData;
+setTimeout(() => {
+if (typeof BitcoinMinuteRefresh !== 'undefined' && BitcoinMinuteRefresh.initialize) {
+BitcoinMinuteRefresh.initialize(window.manualRefresh);
+console.log("BitcoinMinuteRefresh initialized with refresh function");
+} else {
+console.warn("BitcoinMinuteRefresh not available");
+}
+}, 500);
+fetchWorkerData();
+$('.filter-button').click(function () {
 $('.filter-button').removeClass('active');
 $(this).addClass('active');
 filterState.currentFilter = $(this).data('filter');
 filterWorkers();
 });
-// Set up search input handler
-$('#worker-search').on('input', function() {
+$('#worker-search').on('input', function () {
 filterState.searchTerm = $(this).val().toLowerCase();
 filterWorkers();
 });
 });
-// Set up refresh synchronization with main dashboard
-function setupRefreshSync() {
-// Listen for storage events (triggered by main dashboard)
-window.addEventListener('storage', function(event) {
-// Check if this is our dashboard refresh event
-if (event.key === 'dashboardRefreshEvent') {
-console.log("Detected dashboard refresh event");
-// Prevent too frequent refreshes
-const now = Date.now();
-const timeSinceLastRefresh = now - lastUpdateTime;
-if (timeSinceLastRefresh >= MIN_REFRESH_INTERVAL) {
-console.log("Syncing refresh with main dashboard");
-// Reset progress bar and immediately fetch
-resetProgressBar();
-// Refresh the worker data
-fetchWorkerData();
-} else {
-console.log("Skipping too-frequent refresh", timeSinceLastRefresh);
-// Just reset the progress bar to match main dashboard
-resetProgressBar();
-}
-}
-});
-// On page load, check if we should align with main dashboard timing
+// Load timezone setting early
+(function loadTimezoneEarly() {
+// First try to get from localStorage for instant access
 try {
-const lastDashboardRefresh = localStorage.getItem('dashboardRefreshTime');
-if (lastDashboardRefresh) {
-const lastRefreshTime = parseInt(lastDashboardRefresh);
-const timeSinceLastDashboardRefresh = Date.now() - lastRefreshTime;
-// If main dashboard refreshed recently, adjust our timer
-if (timeSinceLastDashboardRefresh < PROGRESS_MAX * 1000) {
-console.log("Adjusting timer to align with main dashboard");
-currentProgress = Math.floor(timeSinceLastDashboardRefresh / 1000);
-updateProgressBar(currentProgress);
-// Calculate when next update will happen (roughly 60 seconds from last dashboard refresh)
-const timeUntilNextRefresh = (PROGRESS_MAX * 1000) - timeSinceLastDashboardRefresh;
-// Schedule a one-time check near the expected refresh time
-if (timeUntilNextRefresh > 0) {
-console.log(`Scheduling coordinated refresh in ${Math.floor(timeUntilNextRefresh/1000)} seconds`);
-setTimeout(function() {
-// Check if a refresh happened in the last few seconds via localStorage event
-const newLastRefresh = parseInt(localStorage.getItem('dashboardRefreshTime') || '0');
-const secondsSinceLastRefresh = (Date.now() - newLastRefresh) / 1000;
-// If dashboard hasn't refreshed in the last 5 seconds, do our own refresh
-if (secondsSinceLastRefresh > 5) {
-console.log("Coordinated refresh time reached, fetching data");
-fetchWorkerData();
-} else {
-console.log("Dashboard already refreshed recently, skipping coordinated refresh");
-}
-}, timeUntilNextRefresh);
-}
-}
+const storedTimezone = localStorage.getItem('dashboardTimezone');
+if (storedTimezone) {
+window.dashboardTimezone = storedTimezone;
+console.log(`Using cached timezone: ${storedTimezone}`);
 }
 } catch (e) {
-console.error("Error reading dashboard refresh time:", e);
+console.error("Error reading timezone from localStorage:", e);
 }
-// Check for dashboard refresh periodically
-setInterval(function() {
-try {
-const lastDashboardRefresh = parseInt(localStorage.getItem('dashboardRefreshTime') || '0');
-const now = Date.now();
-const timeSinceLastRefresh = (now - lastUpdateTime) / 1000;
-const timeSinceDashboardRefresh = (now - lastDashboardRefresh) / 1000;
-// If dashboard refreshed more recently than we did and we haven't refreshed in at least 10 seconds
-if (lastDashboardRefresh > lastUpdateTime && timeSinceLastRefresh > 10) {
-console.log("Catching up with dashboard refresh");
-resetProgressBar();
-fetchWorkerData();
+// Then fetch from server to ensure we have the latest setting
+fetch('/api/timezone')
+.then(response => response.json())
+.then(data => {
+if (data && data.timezone) {
+window.dashboardTimezone = data.timezone;
+console.log(`Set timezone from server: ${data.timezone}`);
+// Cache for future use
+try {
+localStorage.setItem('dashboardTimezone', data.timezone);
+} catch (e) {
+console.error("Error storing timezone in localStorage:", e);
+}
+}
-} catch (e) {
-console.error("Error in periodic dashboard check:", e);
-}
-}, 5000); // Check every 5 seconds
-}
-// Server time update via polling - same as main.js
-function updateServerTime() {
-$.ajax({
-url: "/api/time",
-method: "GET",
-timeout: 5000,
-success: function(data) {
-serverTimeOffset = new Date(data.server_timestamp).getTime() - Date.now();
-serverStartTime = new Date(data.server_start_time).getTime();
-},
-error: function(jqXHR, textStatus, errorThrown) {
-console.error("Error fetching server time:", textStatus, errorThrown);
-}
-});
-}
-// Update uptime display - synced with main dashboard
-function updateUptime() {
-if (serverStartTime) {
-const currentServerTime = Date.now() + serverTimeOffset;
-const diff = currentServerTime - serverStartTime;
-const hours = Math.floor(diff / (1000 * 60 * 60));
-const minutes = Math.floor((diff % (1000 * 60 * 60)) / (1000 * 60));
-const seconds = Math.floor((diff % (1000 * 60)) / 1000);
-$("#uptimeTimer").html("<strong>Uptime:</strong> " + hours + "h " + minutes + "m " + seconds + "s");
-}
-}
+})
+.catch(error => {
+console.error("Error fetching timezone:", error);
+});
+})();
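The `normalizeHashrate()` helper added in this hunk converts every unit to TH/s so mixed-unit workers can be compared on one scale. A condensed, standalone restatement for a quick sanity check (trimmed to the conversion table; TH/s falls through unchanged):

```javascript
// Condensed restatement of the normalizeHashrate() helper above.
function normalizeHashrate(value, unit = 'th/s') {
    if (!value || isNaN(value)) return 0;
    const factors = {
        'ph/s': 1000, 'eh/s': 1000000,
        'gh/s': 1 / 1000, 'mh/s': 1 / 1000000,
        'kh/s': 1 / 1000000000, 'h/s': 1 / 1000000000000
    };
    const f = factors[unit.toLowerCase()];
    return f !== undefined ? value * f : value; // 'th/s' has no entry: returned as-is
}

console.log(normalizeHashrate(2, 'ph/s'));   // 2000 (TH/s)
console.log(normalizeHashrate(500, 'gh/s')); // 0.5 (TH/s)
console.log(normalizeHashrate(3, 'th/s'));   // 3
```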
 // Initialize page elements
 function initializePage() {
-// Initialize mini chart for total hashrate if the element exists
+console.log("Initializing page elements...");
 if (document.getElementById('total-hashrate-chart')) {
 initializeMiniChart();
 }
-// Show loading state
 $('#worker-grid').html('<div class="text-center p-5"><i class="fas fa-spinner fa-spin"></i> Loading worker data...</div>');
 // Add retry button (hidden by default)
 if (!$('#retry-button').length) {
 $('body').append('<button id="retry-button" style="position: fixed; bottom: 20px; left: 20px; z-index: 1000; background: #f7931a; color: black; border: none; padding: 8px 16px; display: none; border-radius: 4px; cursor: pointer;">Retry Loading Data</button>');
-$('#retry-button').on('click', function() {
+$('#retry-button').on('click', function () {
 $(this).text('Retrying...').prop('disabled', true);
 fetchWorkerData(true);
 setTimeout(() => {
@ -211,85 +149,172 @@ function initializePage() {
 }
 }
-// Fetch worker data from API
-function fetchWorkerData(forceRefresh = false) {
-// Track this as a manual refresh for throttling purposes
-lastManualRefreshTime = Date.now();
-$('#worker-grid').addClass('loading-fade');
-// Update progress bar to show data is being fetched
-resetProgressBar();
-// Choose API URL based on whether we're forcing a refresh
-const apiUrl = `/api/workers${forceRefresh ? '?force=true' : ''}`;
+// Update unread notifications badge in navigation
+function updateNotificationBadge() {
 $.ajax({
-url: apiUrl,
-method: 'GET',
-dataType: 'json',
-timeout: 15000, // 15 second timeout
-success: function(data) {
-workerData = data;
-lastUpdateTime = Date.now();
-// Update UI with new data
-updateWorkerGrid();
-updateSummaryStats();
-updateMiniChart();
-updateLastUpdated();
-// Hide retry button
-$('#retry-button').hide();
-// Reset connection retry count
-connectionRetryCount = 0;
-console.log("Worker data updated successfully");
-},
-error: function(xhr, status, error) {
-console.error("Error fetching worker data:", error);
-// Show error in worker grid
-$('#worker-grid').html(`
-<div class="text-center p-5 text-danger">
-<i class="fas fa-exclamation-triangle"></i>
-<p>Error loading worker data: ${error || 'Unknown error'}</p>
-</div>
-`);
-// Show retry button
-$('#retry-button').show();
-// Implement exponential backoff for automatic retry
-connectionRetryCount++;
-const delay = Math.min(30000, 1000 * Math.pow(1.5, Math.min(5, connectionRetryCount)));
-console.log(`Will retry in ${delay/1000} seconds (attempt ${connectionRetryCount})`);
-setTimeout(() => {
-fetchWorkerData(true); // Force refresh on retry
-}, delay);
-},
-complete: function() {
-$('#worker-grid').removeClass('loading-fade');
+url: "/api/notifications/unread_count",
+method: "GET",
+success: function (data) {
+const unreadCount = data.unread_count;
+const badge = $("#nav-unread-badge");
+if (unreadCount > 0) {
+badge.text(unreadCount).show();
+} else {
+badge.hide();
+}
 }
 });
 }
+// Initialize notification badge checking
+function initNotificationBadge() {
+updateNotificationBadge();
+setInterval(updateNotificationBadge, 60000);
+}
+// Server time update via polling - enhanced to use shared storage
+function updateServerTime() {
+console.log("Updating server time...");
+try {
+const storedOffset = localStorage.getItem('serverTimeOffset');
+const storedStartTime = localStorage.getItem('serverStartTime');
+if (storedOffset && storedStartTime) {
+serverTimeOffset = parseFloat(storedOffset);
+serverStartTime = parseFloat(storedStartTime);
+console.log("Using stored server time offset:", serverTimeOffset, "ms");
+if (typeof BitcoinMinuteRefresh !== 'undefined' && BitcoinMinuteRefresh.updateServerTime) {
+BitcoinMinuteRefresh.updateServerTime(serverTimeOffset, serverStartTime);
+}
+return;
+}
+} catch (e) {
+console.error("Error reading stored server time:", e);
+}
+$.ajax({
+url: "/api/time",
+method: "GET",
+timeout: 5000,
+success: function (data) {
+serverTimeOffset = new Date(data.server_timestamp).getTime() - Date.now();
+serverStartTime = new Date(data.server_start_time).getTime();
+localStorage.setItem('serverTimeOffset', serverTimeOffset.toString());
+localStorage.setItem('serverStartTime', serverStartTime.toString());
+if (typeof BitcoinMinuteRefresh !== 'undefined' && BitcoinMinuteRefresh.updateServerTime) {
+BitcoinMinuteRefresh.updateServerTime(serverTimeOffset, serverStartTime);
+}
+console.log("Server time synchronized. Offset:", serverTimeOffset, "ms");
+},
+error: function (jqXHR, textStatus, errorThrown) {
+console.error("Error fetching server time:", textStatus, errorThrown);
+}
+});
+}
+// Utility functions to show/hide loader
+function showLoader() {
+$("#loader").show();
+}
+function hideLoader() {
+$("#loader").hide();
+}
+// Fetch worker data from API with pagination, limiting to 10 pages
+function fetchWorkerData(forceRefresh = false) {
+console.log("Fetching worker data...");
+lastManualRefreshTime = Date.now();
+$('#worker-grid').addClass('loading-fade');
+showLoader();
+const maxPages = 10;
+const requests = [];
+// Create requests for pages 1 through maxPages concurrently
+for (let page = 1; page <= maxPages; page++) {
+const apiUrl = `/api/workers?page=${page}${forceRefresh ? '&force=true' : ''}`;
+requests.push($.ajax({
+url: apiUrl,
+method: 'GET',
+dataType: 'json',
+timeout: 15000
+}));
+}
+// Process all requests concurrently
+Promise.all(requests)
+.then(pages => {
+let allWorkers = [];
+let aggregatedData = null;
+pages.forEach((data, i) => {
+if (data && data.workers && data.workers.length > 0) {
+allWorkers = allWorkers.concat(data.workers);
+if (i === 0) {
+aggregatedData = data; // preserve stats from first page
+}
+} else {
+console.warn(`No workers found on page ${i + 1}`);
+}
+});
+// Deduplicate workers if necessary (using worker.name as unique key)
+const uniqueWorkers = allWorkers.filter((worker, index, self) =>
+index === self.findIndex((w) => w.name === worker.name)
+);
+workerData = aggregatedData || {};
+workerData.workers = uniqueWorkers;
+if (typeof BitcoinMinuteRefresh !== 'undefined' && BitcoinMinuteRefresh.notifyRefresh) {
+BitcoinMinuteRefresh.notifyRefresh();
+}
+updateWorkerGrid();
+updateSummaryStats();
+updateMiniChart();
+updateLastUpdated();
+$('#retry-button').hide();
+connectionRetryCount = 0;
+console.log("Worker data updated successfully");
+$('#worker-grid').removeClass('loading-fade');
+})
+.catch(error => {
+console.error("Error fetching worker data:", error);
+})
+.finally(() => {
+hideLoader();
+});
+}
+// Refresh worker data every 60 seconds
+setInterval(function () {
+console.log("Refreshing worker data at " + new Date().toLocaleTimeString());
+fetchWorkerData();
+}, 60000);
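The paginated fetch above concatenates up to 10 pages, keeps the aggregate stats from page 1, and de-duplicates workers by `worker.name`. The merge step in isolation, with inline sample pages standing in for the AJAX responses (`mergePages` is an illustrative name):

```javascript
// Same merge logic as the Promise.all handler: concat pages,
// keep page-1 stats, drop duplicate worker names (first wins).
function mergePages(pages) {
    let allWorkers = [];
    let aggregated = null;
    pages.forEach((data, i) => {
        if (data && data.workers && data.workers.length > 0) {
            allWorkers = allWorkers.concat(data.workers);
            if (i === 0) aggregated = data;
        }
    });
    const unique = allWorkers.filter((w, idx, self) =>
        idx === self.findIndex(x => x.name === w.name));
    return Object.assign({}, aggregated, { workers: unique });
}

const merged = mergePages([
    { total_earnings: 0.05, workers: [{ name: 'rig1' }, { name: 'rig2' }] },
    { workers: [{ name: 'rig2' }, { name: 'rig3' }] }
]);
console.log(merged.workers.length); // 3 (rig2 counted once)
console.log(merged.total_earnings); // 0.05 (stats taken from page 1)
```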
 // Update the worker grid with data
+// UPDATED FUNCTION
 function updateWorkerGrid() {
+console.log("Updating worker grid...");
 if (!workerData || !workerData.workers) {
 console.error("No worker data available");
 return;
 }
 const workerGrid = $('#worker-grid');
 workerGrid.empty();
 // Apply current filters before rendering
 const filteredWorkers = filterWorkersData(workerData.workers);
 if (filteredWorkers.length === 0) {
 workerGrid.html(`
 <div class="text-center p-5">
@ -299,139 +324,94 @@ function updateWorkerGrid() {
`);
return;
}
// Calculate total unpaid earnings (from the dashboard)
const totalUnpaidEarnings = workerData.total_earnings || 0;
// Sum up hashrates of online workers to calculate share percentages
const totalHashrate = workerData.workers
.filter(w => w.status === 'online')
.reduce((sum, w) => sum + parseFloat(w.hashrate_3hr || 0), 0);
// Calculate share percentage for each worker
const onlineWorkers = workerData.workers.filter(w => w.status === 'online');
const offlineWorkers = workerData.workers.filter(w => w.status === 'offline');
// Allocate 95% to online workers, 5% to offline workers
const onlinePool = totalUnpaidEarnings * 0.95;
const offlinePool = totalUnpaidEarnings * 0.05;
// Generate worker cards
filteredWorkers.forEach(worker => {
// Calculate earnings share based on hashrate proportion
let earningsDisplay = worker.earnings;
// Explicitly recalculate earnings share for display consistency
if (worker.status === 'online' && totalHashrate > 0) {
const hashrateShare = parseFloat(worker.hashrate_3hr || 0) / totalHashrate;
earningsDisplay = (onlinePool * hashrateShare).toFixed(8);
} else if (worker.status === 'offline' && offlineWorkers.length > 0) {
earningsDisplay = (offlinePool / offlineWorkers.length).toFixed(8);
}
// Create worker card
const card = $('<div class="worker-card"></div>');
// Add class based on status
if (worker.status === 'online') {
card.addClass('worker-card-online');
} else {
card.addClass('worker-card-offline');
}
// Add worker type badge
card.append(`<div class="worker-type">${worker.type}</div>`);
// Add worker name
card.append(`<div class="worker-name">${worker.name}</div>`);
// Add status badge
if (worker.status === 'online') {
card.append('<div class="status-badge status-badge-online">ONLINE</div>');
} else {
card.append('<div class="status-badge status-badge-offline">OFFLINE</div>');
}
// Add hashrate bar
const maxHashrate = 200; // TH/s - adjust based on your fleet
const hashratePercent = Math.min(100, (worker.hashrate_3hr / maxHashrate) * 100);
card.append(`
<div class="worker-stats-row">
<div class="worker-stats-label">Hashrate (3hr):</div>
<div class="white-glow">${worker.hashrate_3hr} ${worker.hashrate_3hr_unit}</div>
</div>
<div class="stats-bar-container">
<div class="stats-bar" style="width: ${hashratePercent}%"></div>
</div>
`);
// Add additional stats - NOTE: Using recalculated earnings
card.append(`
<div class="worker-stats">
<div class="worker-stats-row">
<div class="worker-stats-label">Last Share:</div>
<div class="blue-glow">${worker.last_share.split(' ')[1]}</div>
</div>
<div class="worker-stats-row">
<div class="worker-stats-label">Earnings:</div>
<div class="green-glow">${earningsDisplay}</div>
</div>
<div class="worker-stats-row">
<div class="worker-stats-label">Accept Rate:</div>
<div class="white-glow">${worker.acceptance_rate}%</div>
</div>
<div class="worker-stats-row">
<div class="worker-stats-label">Temp:</div>
<div class="${worker.temperature > 65 ? 'red-glow' : 'white-glow'}">${worker.temperature > 0 ? worker.temperature + '°C' : 'N/A'}</div>
</div>
</div>
`);
// Add card to grid
const card = createWorkerCard(worker);
workerGrid.append(card);
});
// Verify the sum of displayed earnings equals the total
console.log(`Total unpaid earnings: ${totalUnpaidEarnings} BTC`);
console.log(`Sum of worker displayed earnings: ${
filteredWorkers.reduce((sum, w) => {
if (w.status === 'online' && totalHashrate > 0) {
const hashrateShare = parseFloat(w.hashrate_3hr || 0) / totalHashrate;
return sum + parseFloat((onlinePool * hashrateShare).toFixed(8));
} else if (w.status === 'offline' && offlineWorkers.length > 0) {
return sum + parseFloat((offlinePool / offlineWorkers.length).toFixed(8));
}
return sum;
}, 0)
} BTC`);
}
// Create worker card element
function createWorkerCard(worker) {
const card = $('<div class="worker-card"></div>');
card.addClass(worker.status === 'online' ? 'worker-card-online' : 'worker-card-offline');
card.append(`<div class="worker-type">${worker.type}</div>`);
card.append(`<div class="worker-name">${worker.name}</div>`);
card.append(`<div class="status-badge ${worker.status === 'online' ? 'status-badge-online' : 'status-badge-offline'}">${worker.status.toUpperCase()}</div>`);
const maxHashrate = 125; // TH/s - adjust based on your fleet
const normalizedHashrate = normalizeHashrate(worker.hashrate_3hr, worker.hashrate_3hr_unit || 'th/s');
const hashratePercent = Math.min(100, (normalizedHashrate / maxHashrate) * 100);
const formattedHashrate = formatHashrateForDisplay(worker.hashrate_3hr, worker.hashrate_3hr_unit || 'th/s');
card.append(`
<div class="worker-stats-row">
<div class="worker-stats-label">Hashrate (3hr):</div>
<div class="white-glow">${formattedHashrate}</div>
</div>
<div class="stats-bar-container">
<div class="stats-bar" style="width: ${hashratePercent}%"></div>
</div>
`);
// Format the last share using the proper method for timezone conversion
let formattedLastShare = 'N/A';
if (worker.last_share && typeof worker.last_share === 'string') {
// Parse the timestamp as UTC, then render it in the configured timezone
try {
// The worker.last_share is likely in format "YYYY-MM-DD HH:MM"
// We need to consider it as UTC and convert to the configured timezone
// Create a proper date object, ensuring UTC interpretation
const dateWithoutTZ = new Date(worker.last_share + 'Z'); // Adding Z to treat as UTC
// Format it according to the configured timezone
formattedLastShare = dateWithoutTZ.toLocaleString('en-US', {
hour: '2-digit',
minute: '2-digit',
hour12: true,
timeZone: window.dashboardTimezone || 'America/Los_Angeles'
});
} catch (e) {
console.error("Error formatting last share time:", e, worker.last_share);
formattedLastShare = worker.last_share; // Fallback to original value
}
}
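// Note (illustrative): appending "Z" forces UTC interpretation. With the
// ISO form, new Date("2025-04-23T07:05Z") is 07:05 UTC regardless of the
// browser's local zone; engines like V8 also accept the space-separated
// "YYYY-MM-DD HH:MM" format used above.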
card.append(`
<div class="worker-stats">
<div class="worker-stats-row">
<div class="worker-stats-label">Last Share:</div>
<div class="blue-glow">${formattedLastShare}</div>
</div>
<div class="worker-stats-row">
<div class="worker-stats-label">Earnings:</div>
<div class="green-glow">${worker.earnings.toFixed(8)}</div>
</div>
</div>
`);
return card;
}
// Filter worker data based on current filter state
function filterWorkersData(workers) {
if (!workers) return [];
return workers.filter(worker => {
        const workerName = (worker.name || '').toLowerCase();
        const isOnline = worker.status === 'online';
        const workerType = (worker.type || '').toLowerCase();
        // Check if worker matches the active type/status filter
        const matchesFilter = filterState.currentFilter === 'all' ||
            (filterState.currentFilter === 'online' && isOnline) ||
            (filterState.currentFilter === 'offline' && !isOnline) ||
            (filterState.currentFilter === 'asic' && workerType === 'asic') ||
            (filterState.currentFilter === 'bitaxe' && workerType === 'bitaxe');
        // Check if worker matches the search term
        const matchesSearch = filterState.searchTerm === '' || workerName.includes(filterState.searchTerm);
return matchesFilter && matchesSearch;
});
}
// Apply filter to rendered worker cards
function filterWorkers() {
if (!workerData || !workerData.workers) return;
// Re-render the worker grid with current filters
updateWorkerGrid();
}
// Update summary stats with normalized hashrate display
function updateSummaryStats() {
    if (!workerData) return;
    // Update worker counts
    $('#workers-count').text(workerData.workers_total || 0);
    $('#workers-online').text(workerData.workers_online || 0);
    $('#workers-offline').text(workerData.workers_offline || 0);
    // Update worker ring percentage
    const onlinePercent = workerData.workers_total > 0 ? workerData.workers_online / workerData.workers_total : 0;
    $('.worker-ring').css('--online-percent', onlinePercent);
    // Format total hashrate with the shared formatter so it matches the main dashboard
    const formattedHashrate = workerData.total_hashrate !== undefined ?
        formatHashrateForDisplay(workerData.total_hashrate, workerData.hashrate_unit || 'TH/s') :
        '0.0 TH/s';
    $('#total-hashrate').text(formattedHashrate);
    $('#total-earnings').text(`${(workerData.total_earnings || 0).toFixed(8)} BTC`);
    $('#avg-acceptance-rate').text(`${(workerData.avg_acceptance_rate || 0).toFixed(2)}%`);
    $('#daily-sats').text(`${numberWithCommas(workerData.daily_sats || 0)} SATS`);
}
// Initialize mini chart
function initializeMiniChart() {
    console.log("Initializing mini chart...");
    const ctx = document.getElementById('total-hashrate-chart');
    if (!ctx) {
        console.error("Mini chart canvas not found");
        return;
    }
    // Generate sample data until real history arrives
    const labels = Array(24).fill('').map((_, i) => i);
    const data = Array(24).fill(0).map(() => Math.random() * 100 + 700);
    miniChart = new Chart(ctx, {
        type: 'line',
        data: {
maintainAspectRatio: false,
scales: {
x: { display: false },
y: {
display: false,
min: Math.min(...data) * 0.9,
max: Math.max(...data) * 1.1
});
}
// Update mini chart with real data and normalization
function updateMiniChart() {
    if (!miniChart || !workerData || !workerData.hashrate_history) {
        console.log("Skipping mini chart update - missing data");
        return;
    }
    const historyData = workerData.hashrate_history;
    if (!historyData || historyData.length === 0) {
        console.log("No hashrate history data available");
        return;
    }
    // Normalize each point to a common unit before charting
    const values = historyData.map(item => normalizeHashrate(parseFloat(item.value) || 0, item.unit || workerData.hashrate_unit || 'th/s'));
    const labels = historyData.map(item => item.time);
    // Update chart data
    miniChart.data.labels = labels;
    miniChart.data.datasets[0].data = values;
    // Update y-axis range, ignoring zero values for the minimum
    const positiveValues = values.filter(v => v > 0);
    const min = positiveValues.length ? Math.min(...positiveValues) : 0;
    const max = Math.max(...values) || 1;
    miniChart.options.scales.y.min = min * 0.9;
    miniChart.options.scales.y.max = max * 1.1;
    // Update the chart without animation
    miniChart.update('none');
}
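// Illustrative helper showing the 10% y-axis padding logic above
// (name and shape are hypothetical, not part of the real code):
function chartYRange(values) {
    const positives = values.filter(v => v > 0);
    const min = positives.length ? Math.min(...positives) : 0;
    const max = Math.max(...values) || 1;
    return { min: min * 0.9, max: max * 1.1 };
}
// e.g. chartYRange([700, 800]) plots roughly the range [630, 880]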
// Update progress bar
function updateProgressBar() {
if (currentProgress < PROGRESS_MAX) {
currentProgress++;
const progressPercent = (currentProgress / PROGRESS_MAX) * 100;
$("#bitcoin-progress-inner").css("width", progressPercent + "%");
// Add glowing effect when close to completion
if (progressPercent > 80) {
$("#bitcoin-progress-inner").addClass("glow-effect");
} else {
$("#bitcoin-progress-inner").removeClass("glow-effect");
}
// Update remaining seconds text
let remainingSeconds = PROGRESS_MAX - currentProgress;
if (remainingSeconds <= 0) {
$("#progress-text").text("Waiting for update...");
$("#bitcoin-progress-inner").addClass("waiting-for-update");
} else {
$("#progress-text").text(remainingSeconds + "s to next update");
$("#bitcoin-progress-inner").removeClass("waiting-for-update");
}
// Check for main dashboard refresh near the end to ensure sync
if (currentProgress >= 55) { // When we're getting close to refresh time
try {
const lastDashboardRefresh = parseInt(localStorage.getItem('dashboardRefreshTime') || '0');
const secondsSinceDashboardRefresh = (Date.now() - lastDashboardRefresh) / 1000;
// If main dashboard just refreshed (within last 5 seconds)
if (secondsSinceDashboardRefresh <= 5) {
console.log("Detected recent dashboard refresh, syncing now");
resetProgressBar();
fetchWorkerData();
return;
}
} catch (e) {
console.error("Error checking dashboard refresh status:", e);
}
}
} else {
// Reset progress bar if it's time to refresh
// But first check if the main dashboard refreshed recently
try {
const lastDashboardRefresh = parseInt(localStorage.getItem('dashboardRefreshTime') || '0');
const secondsSinceDashboardRefresh = (Date.now() - lastDashboardRefresh) / 1000;
// If dashboard refreshed in the last 10 seconds, wait for it instead of refreshing ourselves
if (secondsSinceDashboardRefresh < 10) {
console.log("Waiting for dashboard refresh event instead of refreshing independently");
return;
}
} catch (e) {
console.error("Error checking dashboard refresh status:", e);
}
// If main dashboard hasn't refreshed recently, do our own refresh
if (Date.now() - lastUpdateTime > PROGRESS_MAX * 1000) {
console.log("Progress bar expired, fetching data");
fetchWorkerData();
}
}
}
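// Illustrative helper for the cross-tab sync above (hypothetical name):
// the main dashboard writes Date.now() to localStorage['dashboardRefreshTime'],
// and this page defers its own refresh when that write is recent enough.
function shouldDeferToDashboard(lastRefreshMs, nowMs, windowSeconds) {
    return (nowMs - lastRefreshMs) / 1000 < windowSeconds;
}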
// Reset progress bar
function resetProgressBar() {
currentProgress = 0;
$("#bitcoin-progress-inner").css("width", "0%");
$("#bitcoin-progress-inner").removeClass("glow-effect");
$("#bitcoin-progress-inner").removeClass("waiting-for-update");
$("#progress-text").text(PROGRESS_MAX + "s to next update");
}
// Update the last updated timestamp
function updateLastUpdated() {
    if (!workerData || !workerData.timestamp) return;
    try {
        const timestamp = new Date(workerData.timestamp);
        // Get the configured timezone with a fallback
        const configuredTimezone = window.dashboardTimezone || 'America/Los_Angeles';
        // Format with the configured timezone
        const options = {
            year: 'numeric',
            month: 'short',
            day: 'numeric',
            hour: '2-digit',
            minute: '2-digit',
            second: '2-digit',
            hour12: true,
            timeZone: configuredTimezone // Explicitly use the configured timezone
        };
        // Format the timestamp and update the DOM
        const formattedTime = timestamp.toLocaleString('en-US', options);
        $("#lastUpdated").html("<strong>Last Updated:</strong> " +
            formattedTime + "<span id='terminal-cursor'></span>");
        console.log(`Last updated timestamp using timezone: ${configuredTimezone}`);
    } catch (e) {
        console.error("Error formatting timestamp:", e);
        // Fall back to a basic timestamp on error
        $("#lastUpdated").html("<strong>Last Updated:</strong> " +
            new Date().toLocaleString() + "<span id='terminal-cursor'></span>");
    }
}

templates/base.html
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{% block title %}BTC-OS MINING DASHBOARD {% endblock %}</title>
<!-- Common fonts -->
<link href="https://fonts.googleapis.com/css2?family=Orbitron:wght@400;700&family=VT323&display=swap" rel="stylesheet">
<!-- Font Awesome -->
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.0/css/all.min.css" crossorigin="anonymous" referrerpolicy="no-referrer" />
<!-- Bootstrap -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
<!-- Common CSS -->
<link rel="stylesheet" href="/static/css/common.css">
<!-- Custom CSS -->
<link rel="stylesheet" href="/static/css/theme-toggle.css">
<!-- Theme JS (added to ensure consistent application of theme) -->
<script src="/static/js/theme.js"></script>
<!-- Page-specific CSS -->
{% block css %}{% endblock %}
<script>
// Execute this immediately to preload theme
(function () {
const useDeepSea = localStorage.getItem('useDeepSeaTheme') === 'true';
const themeClass = useDeepSea ? 'deepsea-theme' : 'bitcoin-theme';
// Apply theme class to html element
document.documentElement.classList.add(themeClass);
// Create and add loader
document.addEventListener('DOMContentLoaded', function () {
// Create loader element
const loader = document.createElement('div');
loader.id = 'theme-loader';
const icon = document.createElement('div');
icon.id = 'loader-icon';
icon.innerHTML = useDeepSea ? '🌊' : '₿';
const text = document.createElement('div');
text.id = 'loader-text';
text.textContent = 'Loading ' + (useDeepSea ? 'DeepSea' : 'Bitcoin') + ' Theme';
loader.appendChild(icon);
loader.appendChild(text);
document.body.appendChild(loader);
// Add fade-in effect for content once theme is loaded
setTimeout(function () {
document.body.style.visibility = 'visible';
// Fade out loader
loader.style.transition = 'opacity 0.5s ease';
loader.style.opacity = '0';
// Remove loader after fade
setTimeout(function () {
if (loader && loader.parentNode) {
loader.parentNode.removeChild(loader);
}
}, 500);
}, 300);
});
})();
</script>
</head>
<body>
<script>
// Add underwater effects for DeepSea theme
document.addEventListener('DOMContentLoaded', function () {
// Check if DeepSea theme is active
if (localStorage.getItem('useDeepSeaTheme') === 'true') {
// Create underwater light rays
const rays = document.createElement('div');
rays.className = 'underwater-rays';
document.body.appendChild(rays);
// Create digital noise
const noise = document.createElement('div');
noise.className = 'digital-noise';
document.body.appendChild(noise);
}
});
</script>
<div class="container-fluid">
<!-- Connection status indicator -->
<div id="connectionStatus"></div>
<h1 class="text-center">
<a href="/" style="text-decoration:none; color:inherit;">
{% block header %}BTC-OS MINING DASHBOARD{% endblock %}
</a>
</h1>
<!-- Top right link -->
<a href="https://x.com/DJObleezy" id="topRightLink" target="_blank" rel="noopener noreferrer">MADE BY @DJO₿LEEZY</a>
<!-- Theme toggle button (new) -->
<button id="themeToggle" class="theme-toggle-btn">
<span>Toggle Theme</span>
</button>
{% block last_updated %}
<p class="text-center" id="lastUpdated" style="color: #f7931a; text-transform: uppercase;"><strong>LAST UPDATED:</strong> {{ current_time }}<span id="terminal-cursor"></span></p>
{% endblock %}
{% block navigation %}
<div class="navigation-links">
<a href="/dashboard" class="nav-link {% block dashboard_active %}{% endblock %}">DASHBOARD</a>
<a href="/workers" class="nav-link {% block workers_active %}{% endblock %}">WORKERS</a>
<a href="/blocks" class="nav-link {% block blocks_active %}{% endblock %}">BLOCKS</a>
<a href="/notifications" class="nav-link {% block notifications_active %}{% endblock %}">
NOTIFICATIONS
<span id="nav-unread-badge" class="nav-badge"></span>
</a>
</div>
{% endblock %}
<!-- Main content area -->
{% block content %}{% endblock %}
<!-- Hidden Congrats Message -->
{% block congrats_message %}
<div id="congratsMessage" style="display:none; position: fixed; top: 20px; left: 50%; transform: translateX(-50%); z-index: 1000; background: #f7931a; color: #000; padding: 10px; border-radius: 5px; box-shadow: 0 0 15px rgba(247, 147, 26, 0.7);"></div>
{% endblock %}
<!-- Footer -->
<footer class="footer text-center">
<p>Not affiliated with <a href="https://www.Ocean.xyz">Ocean.xyz</a></p>
</footer>
</div>
<!-- External JavaScript libraries -->
<script src="https://code.jquery.com/jquery-3.7.0.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script src="https://cdn.jsdelivr.net/npm/chartjs-plugin-annotation@1.1.0"></script>
<!-- Theme toggle initialization -->
<script>
document.addEventListener('DOMContentLoaded', function () {
// Initialize theme toggle button based on current theme
const themeToggle = document.getElementById('themeToggle');
if (themeToggle) {
// Check current theme
const isDeepSea = localStorage.getItem('useDeepSeaTheme') === 'true';
// Update button style based on theme
if (isDeepSea) {
themeToggle.style.borderColor = '#0088cc';
themeToggle.style.color = '#0088cc';
} else {
themeToggle.style.borderColor = '#f2a900';
themeToggle.style.color = '#f2a900';
}
// Add click event listener
themeToggle.addEventListener('click', function () {
toggleTheme(); // This will now trigger a page refresh
});
}
});
</script>
<!-- Page-specific JavaScript -->
{% block javascript %}{% endblock %}
<!-- Bitcoin Progress Bar -->
<script src="/static/js/BitcoinProgressBar.js"></script>
</body>
</html>

templates/blocks.html
{% extends "base.html" %}
{% block title %}BLOCKS - BTC-OS MINING DASHBOARD {% endblock %}
{% block css %}
<link rel="stylesheet" href="/static/css/blocks.css">
<link rel="stylesheet" href="/static/css/theme-toggle.css">
{% endblock %}
{% block header %}BLOCKCHAIN MONITOR v 0.1{% endblock %}
{% block blocks_active %}active{% endblock %}
{% block content %}
<!-- Latest block stats -->
<div class="row mb-2 equal-height">
<div class="col-12">
<div class="card">
<div class="card-header">LATEST BLOCK STATS</div>
<div class="card-body">
<div class="latest-block-stats">
<div class="stat-item">
<strong>BLOCK HEIGHT:</strong>
<span id="latest-height" class="metric-value white">Loading...</span>
</div>
<div class="stat-item">
<strong>TIME:</strong>
<span id="latest-time" class="metric-value blue">Loading...</span>
</div>
<div class="stat-item">
<strong>TRANSACTIONS:</strong>
<span id="latest-tx-count" class="metric-value white">Loading...</span>
</div>
<div class="stat-item">
<strong>SIZE:</strong>
<span id="latest-size" class="metric-value white">Loading...</span>
</div>
<div class="stat-item">
<strong>DIFFICULTY:</strong>
<span id="latest-difficulty" class="metric-value yellow">Loading...</span>
</div>
<div class="stat-item">
<strong>POOL:</strong>
<span id="latest-pool" class="metric-value green">Loading...</span>
</div>
<div class="stat-item">
<strong>AVG FEE RATE:</strong>
<span id="latest-fee-rate" class="metric-value yellow" style="animation: pulse 1s infinite;">Loading...</span>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Blocks grid -->
<div class="row">
<div class="col-12">
<div class="card">
<div class="card-header">RECENT BLOCKS</div>
<div class="card-body">
<div class="blocks-container">
<div id="blocks-grid" class="blocks-grid">
<!-- Blocks will be generated here via JavaScript -->
<div class="loader">
<span class="loader-text">Connecting to mempool.guide API<span class="terminal-cursor"></span></span>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Block details modal -->
<div id="block-modal" class="block-modal">
<div class="block-modal-content">
<div class="block-modal-header">
<span class="block-modal-title">BLOCK DETAILS</span>
<span class="block-modal-close">&times;</span>
</div>
<div class="block-modal-body">
<div id="block-details">
<!-- Block details will be displayed here -->
</div>
</div>
</div>
</div>
{% endblock %}
{% block javascript %}
<script src="/static/js/blocks.js"></script>
{% endblock %}

File diff suppressed because it is too large

templates/dashboard.html
{% extends "base.html" %}
{% block title %}BTC-OS Mining Dashboard {% endblock %}
{% block css %}
<link rel="stylesheet" href="/static/css/dashboard.css">
<link rel="stylesheet" href="/static/css/theme-toggle.css">
{% endblock %}
{% block dashboard_active %}active{% endblock %}
{% block content %}
<!-- Graph Container -->
<div id="graphContainer" class="mb-2">
<canvas id="trendGraph" style="width: 100%; height: 100%; position: relative; z-index: 2;"></canvas>
</div>
<!-- Miner Status and Payout Info -->
<div class="row mb-2 equal-height">
<div class="col-md-6">
<div class="card">
<div class="card-header">Miner Status</div>
<div class="card-body">
<p>
<strong>Status:</strong>
<span id="miner_status" class="metric-value">
{% if metrics and metrics.workers_hashing and metrics.workers_hashing > 0 %}
<span class="status-green">ONLINE</span> <span class="online-dot"></span>
{% else %}
<span class="status-red">OFFLINE</span> <span class="offline-dot"></span>
{% endif %}
</span>
</p>
<p>
<strong>Workers Hashing:</strong>
<span id="workers_hashing" class="metric-value">{{ metrics.workers_hashing or 0 }}</span>
<span id="indicator_workers_hashing"></span>
</p>
<p>
<strong>Last Share:</strong>
<span id="last_share" class="metric-value">{{ metrics.total_last_share or "N/A" }}</span>
</p>
<p>
<strong>Blocks Found:</strong>
<span id="blocks_found" class="metric-value white">
{{ metrics.blocks_found if metrics and metrics.blocks_found else "0" }}
</span>
<span id="indicator_blocks_found"></span>
</p>
</div>
</div>
</div>
<div class="col-md-6">
<div class="card" id="payoutMiscCard">
<div class="card-header">Payout Info</div>
<div class="card-body">
<p>
<strong>Unpaid Earnings:</strong>
<span id="unpaid_earnings" class="metric-value green">
{% if metrics and metrics.unpaid_earnings %}
{{ metrics.unpaid_earnings }} BTC
{% else %}
0 BTC
{% endif %}
</span>
<span id="indicator_unpaid_earnings"></span>
</p>
<p>
<strong>Last Block:</strong>
<span id="last_block_height" class="metric-value white">
{{ metrics.last_block_height|commafy if metrics and metrics.last_block_height else "N/A" }}
</span>
<span id="last_block_time" class="metric-value blue">
{{ metrics.last_block_time if metrics and metrics.last_block_time else "N/A" }}
</span>
<span class="green">
{% if metrics and metrics.last_block_earnings %}
+{{ metrics.last_block_earnings|int|commafy }} SATS
{% else %}
+0 SATS
{% endif %}
</span>
<span id="indicator_last_block"></span>
</p>
<p>
<strong>Est. Time to Payout:</strong>
<span id="est_time_to_payout" class="metric-value yellow">
{{ metrics.est_time_to_payout if metrics and metrics.est_time_to_payout else "N/A" }}
</span>
<span id="indicator_est_time_to_payout"></span>
</p>
<p>
<strong>Pool Fees:</strong>
<span id="pool_fees_percentage" class="metric-value">
{% if metrics and metrics.pool_fees_percentage is defined and metrics.pool_fees_percentage is not none %}
{{ metrics.pool_fees_percentage }}%
{% if metrics.pool_fees_percentage is not none and metrics.pool_fees_percentage >= 0.9 and metrics.pool_fees_percentage <= 1.3 %}
<span class="fee-star"></span> <span class="datum-label">DATUM</span> <span class="fee-star"></span>
{% endif %}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_pool_fees_percentage"></span>
</p>
</div>
</div>
</div>
</div>
<!-- Pool Hashrates and Bitcoin Network Stats -->
<div class="row equal-height">
<div class="col-md-6">
<div class="card">
<div class="card-header">Pool Hashrates</div>
<div class="card-body">
<p>
<strong>Pool Hashrate:</strong>
<span id="pool_total_hashrate" class="metric-value white">
{% if metrics and metrics.pool_total_hashrate and metrics.pool_total_hashrate_unit %}
{{ metrics.pool_total_hashrate }} {{ metrics.pool_total_hashrate_unit[:-2]|upper ~ metrics.pool_total_hashrate_unit[-2:] }}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_pool_total_hashrate"></span>
</p>
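                        {# Unit formatting note (illustrative): the slicing above uppercases the
                           prefix and keeps the suffix, e.g. "th/s"[:-2] -> "TH", "th/s"[-2:] -> "/s",
                           yielding "TH/s". #}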
<hr>
<p>
<strong>24hr Avg Hashrate:</strong>
<span id="hashrate_24hr" class="metric-value white">
{% if metrics and metrics.hashrate_24hr %}
{{ metrics.hashrate_24hr }}
{% if metrics.hashrate_24hr_unit %}
{{ metrics.hashrate_24hr_unit[:-2]|upper ~ metrics.hashrate_24hr_unit[-2:] }}
{% else %}
TH/s
{% endif %}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_hashrate_24hr"></span>
</p>
<p>
<strong>3hr Avg Hashrate:</strong>
<span id="hashrate_3hr" class="metric-value white">
{% if metrics and metrics.hashrate_3hr %}
{{ metrics.hashrate_3hr }}
{% if metrics.hashrate_3hr_unit %}
{{ metrics.hashrate_3hr_unit[:-2]|upper ~ metrics.hashrate_3hr_unit[-2:] }}
{% else %}
TH/s
{% endif %}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_hashrate_3hr"></span>
</p>
<p>
<strong>10min Avg Hashrate:</strong>
<span id="hashrate_10min" class="metric-value white">
{% if metrics and metrics.hashrate_10min %}
{{ metrics.hashrate_10min }}
{% if metrics.hashrate_10min_unit %}
{{ metrics.hashrate_10min_unit[:-2]|upper ~ metrics.hashrate_10min_unit[-2:] }}
{% else %}
TH/s
{% endif %}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_hashrate_10min"></span>
</p>
<p>
<strong>60sec Avg Hashrate:</strong>
<span id="hashrate_60sec" class="metric-value white">
{% if metrics and metrics.hashrate_60sec %}
{{ metrics.hashrate_60sec }}
{% if metrics.hashrate_60sec_unit %}
{{ metrics.hashrate_60sec_unit[:-2]|upper ~ metrics.hashrate_60sec_unit[-2:] }}
{% else %}
TH/s
{% endif %}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_hashrate_60sec"></span>
</p>
</div>
</div>
</div>
<div class="col-md-6">
<div class="card">
<div class="card-header">Network Stats</div>
<div class="card-body">
<p>
<strong>BTC Price:</strong>
<span id="btc_price" class="metric-value yellow">
{% if metrics and metrics.btc_price %}
${{ "%.2f"|format(metrics.btc_price) }}
{% else %}
$0.00
{% endif %}
</span>
<span id="indicator_btc_price"></span>
</p>
<p>
<strong>Block Number:</strong>
<span id="block_number" class="metric-value white">
{% if metrics and metrics.block_number %}
{{ metrics.block_number|commafy }}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_block_number"></span>
</p>
<p>
<strong>Network Hashrate:</strong>
<span id="network_hashrate" class="metric-value white">
{% if metrics and metrics.network_hashrate %}
{{ metrics.network_hashrate|round|commafy }} EH/s
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_network_hashrate"></span>
</p>
<p>
<strong>Difficulty:</strong>
<span id="difficulty" class="metric-value white">
{% if metrics and metrics.difficulty %}
{{ metrics.difficulty|round|commafy }}
{% else %}
N/A
{% endif %}
</span>
<span id="indicator_difficulty"></span>
</p>
</div>
</div>
</div>
</div>
<!-- Satoshi and USD Metrics -->
<div class="row equal-height">
<div class="col-md-6">
<div class="card">
<div class="card-header">SATOSHI EARNINGS</div>
<div class="card-body">
<p>
<strong>Projected Daily (Net):</strong>
<span id="daily_mined_sats" class="metric-value yellow">
{% if metrics and metrics.daily_mined_sats %}
{{ metrics.daily_mined_sats|commafy }} SATS
{% else %}
0 sats
{% endif %}
</span>
<span id="indicator_daily_mined_sats"></span>
</p>
<p>
<strong>Projected Monthly (Net):</strong>
<span id="monthly_mined_sats" class="metric-value yellow">
{% if metrics and metrics.monthly_mined_sats %}
{{ metrics.monthly_mined_sats|commafy }} SATS
{% else %}
0 sats
{% endif %}
</span>
<span id="indicator_monthly_mined_sats"></span>
</p>
<p>
<strong>Est. Earnings/Day:</strong>
<span id="estimated_earnings_per_day_sats" class="metric-value yellow">
{% if metrics and metrics.estimated_earnings_per_day_sats %}
{{ metrics.estimated_earnings_per_day_sats|commafy }} SATS
{% else %}
0 sats
{% endif %}
</span>
<span id="indicator_estimated_earnings_per_day_sats"></span>
</p>
<p>
<strong>Est. Earnings/Block:</strong>
<span id="estimated_earnings_next_block_sats" class="metric-value yellow">
{% if metrics and metrics.estimated_earnings_next_block_sats %}
{{ metrics.estimated_earnings_next_block_sats|commafy }} SATS
{% else %}
0 sats
{% endif %}
</span>
<span id="indicator_estimated_earnings_next_block_sats"></span>
</p>
<p>
<strong>Est. Rewards in Window:</strong>
<span id="estimated_rewards_in_window_sats" class="metric-value yellow">
{% if metrics and metrics.estimated_rewards_in_window_sats %}
{{ metrics.estimated_rewards_in_window_sats|commafy }} SATS
{% else %}
0 sats
{% endif %}
</span>
<span id="indicator_estimated_rewards_in_window_sats"></span>
</p>
</div>
</div>
</div>
<div class="col-md-6">
<div class="card">
<div class="card-header">USD EARNINGS</div>
<div class="card-body">
<p>
<strong>Daily Revenue:</strong>
<span id="daily_revenue" class="metric-value green">
{% if metrics and metrics.daily_revenue is defined and metrics.daily_revenue is not none %}
${{ "%.2f"|format(metrics.daily_revenue) }}
{% else %}
$0.00
{% endif %}
</span>
<span id="indicator_daily_revenue"></span>
</p>
<p>
<strong>Daily Power Cost:</strong>
<span id="daily_power_cost" class="metric-value red">
{% if metrics and metrics.daily_power_cost is defined and metrics.daily_power_cost is not none %}
${{ "%.2f"|format(metrics.daily_power_cost) }}
{% else %}
$0.00
{% endif %}
</span>
<span id="indicator_daily_power_cost"></span>
</p>
<p>
<strong>Daily Profit (USD):</strong>
<span id="daily_profit_usd" class="metric-value green">
{% if metrics and metrics.daily_profit_usd is defined and metrics.daily_profit_usd is not none %}
${{ "%.2f"|format(metrics.daily_profit_usd) }}
{% else %}
$0.00
{% endif %}
</span>
<span id="indicator_daily_profit_usd"></span>
</p>
<p>
<strong>Monthly Profit (USD):</strong>
<span id="monthly_profit_usd" class="metric-value green">
{% if metrics and metrics.monthly_profit_usd is defined and metrics.monthly_profit_usd is not none %}
${{ "%.2f"|format(metrics.monthly_profit_usd) }}
{% else %}
$0.00
{% endif %}
</span>
<span id="indicator_monthly_profit_usd"></span>
</p>
</div>
</div>
</div>
</div>
{% endblock %}
{% block javascript %}
<!-- External JavaScript file with our application logic -->
<script src="/static/js/main.js"></script>
{% endblock %}

<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Error - Mining Dashboard</title>
<!-- Include both Orbitron and VT323 fonts -->
<link href="https://fonts.googleapis.com/css2?family=Orbitron:wght@400;700&family=VT323&display=swap" rel="stylesheet">
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
<link rel="stylesheet" href="/static/css/error.css">
</head>
<body>
<div class="container">
<div class="error-container">
<h1>ERROR!</h1>
<div class="error-code">CODE: SYS_EXCEPTION_0x69420</div>
<p>{{ message }}<span class="terminal-cursor"></span></p>
<a href="/dashboard" class="btn btn-primary">RETURN TO DASHBOARD</a>
</div>
</div>
</body>
</html>

File diff suppressed because it is too large

@@ -0,0 +1,95 @@
{% extends "base.html" %}
{% block title %}NOTIFICATIONS - BTC-OS MINING DASHBOARD{% endblock %}
{% block css %}
<link rel="stylesheet" href="/static/css/notifications.css">
<link rel="stylesheet" href="/static/css/theme-toggle.css">
{% endblock %}
{% block header %}NOTIFICATION CENTER v 0.1{% endblock %}
{% block notifications_active %}active{% endblock %}
{% block content %}
<!-- Notification Controls -->
<div class="row mb-2">
<div class="col-12">
<div class="card">
<div class="card-header">NOTIFICATION CONTROLS</div>
<div class="card-body">
<div class="notification-controls">
<div class="filter-buttons">
<button class="filter-button active" data-filter="all">All</button>
<button class="filter-button" data-filter="hashrate">Hashrate</button>
<button class="filter-button" data-filter="block">Blocks</button>
<button class="filter-button" data-filter="worker">Workers</button>
<button class="filter-button" data-filter="earnings">Earnings</button>
<button class="filter-button" data-filter="system">System</button>
</div>
<div class="notification-actions">
<button id="mark-all-read" class="action-button">Mark All as Read</button>
<button id="clear-read" class="action-button">Clear Read Notifications</button>
<button id="clear-all" class="action-button danger">Clear All</button>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Notifications List -->
<div class="row">
<div class="col-12">
<div class="card">
<div class="card-header">
<span>NOTIFICATIONS</span>
<span id="unread-badge" class="unread-badge">0</span>
</div>
<div class="card-body">
<div id="notifications-container">
<!-- Notifications will be populated here by JavaScript -->
<div class="loading-message">Loading notifications<span class="terminal-cursor"></span></div>
</div>
<!-- Pagination -->
<div class="pagination-controls">
<button id="load-more" class="load-more-button">LOAD MORE</button>
</div>
</div>
</div>
</div>
</div>
<!-- Empty State Template (hidden) -->
<div id="empty-template" style="display:none;">
<div class="empty-state">
<i class="fas fa-bell-slash"></i>
<p>No notifications to display</p>
</div>
</div>
<!-- Notification Template (hidden) -->
<div id="notification-template" style="display:none;">
<div class="notification-item" data-id="">
<div class="notification-icon">
<i class="fas"></i>
</div>
<div class="notification-content">
<div class="notification-message"></div>
<div class="notification-meta">
<span class="notification-time"></span>
<span class="notification-category"></span>
</div>
</div>
<div class="notification-actions">
<button class="mark-read-button"><i class="fas fa-check"></i></button>
<button class="delete-button"><i class="fas fa-trash"></i></button>
</div>
</div>
</div>
{% endblock %}
{% block javascript %}
<script src="/static/js/notifications.js"></script>
{% endblock %}
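The controls and templates above assume a backend that can filter notifications by category, page them for the "LOAD MORE" button, and report an unread count for the badge. A minimal stdlib sketch of that server-side logic (function and field names are illustrative, not taken from this repository):

```python
from itertools import islice

def page_notifications(notifications, category="all", offset=0, limit=10):
    """Return one page of notifications for a category, plus the unread
    count that would feed the unread badge. Field names are illustrative."""
    filtered = [n for n in notifications
                if category == "all" or n.get("category") == category]
    unread = sum(1 for n in filtered if not n.get("read"))
    page = list(islice(filtered, offset, offset + limit))
    return {"notifications": page, "unread_count": unread, "total": len(filtered)}
```

A "LOAD MORE" click would simply re-request with `offset` advanced by `limit` until `total` is exhausted.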

File diff suppressed because it is too large

worker_service.py (new file, 792 lines)

@@ -0,0 +1,792 @@
"""
Worker service module for managing workers data.
"""
import logging
import random
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
from config import get_timezone
class WorkerService:
"""Service for retrieving and managing worker data."""
def __init__(self):
"""Initialize the worker service."""
self.worker_data_cache = None
self.last_worker_data_update = None
self.WORKER_DATA_CACHE_TIMEOUT = 60 # Cache worker data for 60 seconds
self.dashboard_service = None # Will be set by App.py during initialization
self.sats_per_btc = 100_000_000 # Constant for conversion
def set_dashboard_service(self, dashboard_service):
"""
Set the dashboard service instance - to be called from App.py
Args:
dashboard_service (MiningDashboardService): The initialized dashboard service
"""
self.dashboard_service = dashboard_service
# Immediately access the wallet from dashboard_service when it's set
if hasattr(dashboard_service, 'wallet'):
self.wallet = dashboard_service.wallet
logging.info(f"Worker service updated with new wallet: {self.wallet}")
logging.info("Dashboard service connected to worker service")
def generate_default_workers_data(self):
"""
Generate default worker data when no metrics are available.
Returns:
dict: Default worker data structure
"""
return {
"workers": [],
"workers_total": 0,
"workers_online": 0,
"workers_offline": 0,
"total_hashrate": 0.0,
"hashrate_unit": "TH/s",
"total_earnings": 0.0,
"daily_sats": 0,
"hashrate_history": [],
"timestamp": datetime.now(ZoneInfo(get_timezone())).isoformat()
}
def get_workers_data(self, cached_metrics, force_refresh=False):
"""
Get worker data with caching for better performance.
Args:
cached_metrics (dict): Cached metrics from the dashboard
force_refresh (bool): Whether to force a refresh of cached data
Returns:
dict: Worker data
"""
current_time = datetime.now().timestamp()
# Return cached data if it's still fresh and not forced to refresh
if not force_refresh and self.worker_data_cache and self.last_worker_data_update and \
(current_time - self.last_worker_data_update) < self.WORKER_DATA_CACHE_TIMEOUT:
# Even when using cached data, sync worker count with main dashboard
if cached_metrics and cached_metrics.get("workers_hashing") is not None:
self.sync_worker_counts_with_dashboard(self.worker_data_cache, cached_metrics)
logging.info("Using cached worker data")
return self.worker_data_cache
try:
# First try to get actual worker data from the dashboard service
if self.dashboard_service:
logging.info("Attempting to fetch real worker data from Ocean.xyz")
real_worker_data = self.dashboard_service.get_worker_data()
if real_worker_data and real_worker_data.get('workers') and len(real_worker_data['workers']) > 0:
# Validate that worker names are not just "Online" or "Offline"
valid_names = False
for worker in real_worker_data['workers']:
name = worker.get('name', '').lower()
if name and name not in ['online', 'offline', 'total', 'worker', 'status']:
valid_names = True
break
if valid_names:
logging.info(f"Successfully retrieved {len(real_worker_data['workers'])} real workers from Ocean.xyz")
# Add hashrate history if available in cached metrics
if cached_metrics and cached_metrics.get("arrow_history") and cached_metrics["arrow_history"].get("hashrate_3hr"):
real_worker_data["hashrate_history"] = cached_metrics["arrow_history"]["hashrate_3hr"]
# Sync with dashboard metrics to ensure consistency
if cached_metrics:
self.sync_worker_counts_with_dashboard(real_worker_data, cached_metrics)
# Update cache
self.worker_data_cache = real_worker_data
self.last_worker_data_update = current_time
return real_worker_data
else:
logging.warning("Real worker data had invalid names (like 'online'/'offline'), falling back to simulated data")
else:
logging.warning("Real worker data fetch returned no workers, falling back to simulated data")
else:
logging.warning("Dashboard service not available, cannot fetch real worker data")
# Fallback to simulated data if real data fetch fails or returns no workers
logging.info("Generating fallback simulated worker data")
worker_data = self.generate_fallback_data(cached_metrics)
# Add hashrate history if available in cached metrics
if cached_metrics and cached_metrics.get("arrow_history") and cached_metrics["arrow_history"].get("hashrate_3hr"):
worker_data["hashrate_history"] = cached_metrics["arrow_history"]["hashrate_3hr"]
# Ensure worker counts match dashboard metrics
if cached_metrics:
self.sync_worker_counts_with_dashboard(worker_data, cached_metrics)
# Update cache
self.worker_data_cache = worker_data
self.last_worker_data_update = current_time
logging.info(f"Successfully generated fallback worker data: {worker_data['workers_total']} workers")
return worker_data
except Exception as e:
logging.error(f"Error getting worker data: {e}")
fallback_data = self.generate_fallback_data(cached_metrics)
# Even on error, try to sync with dashboard metrics
if cached_metrics:
self.sync_worker_counts_with_dashboard(fallback_data, cached_metrics)
return fallback_data
def sync_worker_counts_with_dashboard(self, worker_data, dashboard_metrics):
"""
Synchronize worker counts and other metrics between worker data and dashboard metrics.
Args:
worker_data (dict): Worker data to be updated
dashboard_metrics (dict): Dashboard metrics with worker count and other data
"""
if not worker_data or not dashboard_metrics:
return
# Sync worker count
dashboard_worker_count = dashboard_metrics.get("workers_hashing")
# Only proceed if dashboard has valid worker count
if dashboard_worker_count is not None:
current_worker_count = worker_data.get("workers_total", 0)
# If counts already match, no need to sync workers count
if current_worker_count != dashboard_worker_count:
logging.info(f"Syncing worker count: worker page({current_worker_count}) → dashboard({dashboard_worker_count})")
# Update the total count
worker_data["workers_total"] = dashboard_worker_count
# Adjust online/offline counts proportionally
current_online = worker_data.get("workers_online", 0)
current_total = max(1, current_worker_count) # Avoid division by zero
# Calculate ratio of online workers
online_ratio = current_online / current_total
# Recalculate online and offline counts
new_online_count = round(dashboard_worker_count * online_ratio)
new_offline_count = dashboard_worker_count - new_online_count
# Update the counts
worker_data["workers_online"] = new_online_count
worker_data["workers_offline"] = new_offline_count
logging.info(f"Updated worker counts - Total: {dashboard_worker_count}, Online: {new_online_count}, Offline: {new_offline_count}")
# If we have worker instances, try to adjust them as well
if "workers" in worker_data and isinstance(worker_data["workers"], list):
self.adjust_worker_instances(worker_data, dashboard_worker_count)
# Sync daily sats - critical for fixing the daily sats discrepancy
if dashboard_metrics.get("daily_mined_sats") is not None:
daily_sats_value = dashboard_metrics.get("daily_mined_sats")
if daily_sats_value != worker_data.get("daily_sats"):
worker_data["daily_sats"] = daily_sats_value
logging.info(f"Synced daily sats: {worker_data['daily_sats']}")
# Sync other important metrics
if dashboard_metrics.get("total_hashrate") is not None:
worker_data["total_hashrate"] = dashboard_metrics.get("total_hashrate")
if dashboard_metrics.get("unpaid_earnings") is not None:
# Attempt to convert string to float if needed
unpaid_value = dashboard_metrics.get("unpaid_earnings")
if isinstance(unpaid_value, str):
try:
unpaid_value = float(unpaid_value.split()[0].replace(',', ''))
except (ValueError, IndexError):
pass
worker_data["total_earnings"] = unpaid_value
def adjust_worker_instances(self, worker_data, target_count):
"""
Adjust the number of worker instances to match the target count.
Args:
worker_data (dict): Worker data containing worker instances
target_count (int): Target number of worker instances
"""
current_workers = worker_data.get("workers", [])
current_count = len(current_workers)
if current_count == target_count:
return
if current_count < target_count:
# Need to add more workers
workers_to_add = target_count - current_count
# Get existing online/offline worker counts
online_workers = [w for w in current_workers if w["status"] == "online"]
offline_workers = [w for w in current_workers if w["status"] == "offline"]
# Use the same online/offline ratio for new workers
online_ratio = len(online_workers) / max(1, current_count)
new_online = round(workers_to_add * online_ratio)
new_offline = workers_to_add - new_online
# Copy and adjust existing workers to create new ones
if online_workers and new_online > 0:
for i in range(new_online):
# Pick a random online worker as template
template = random.choice(online_workers).copy()
# Give it a new name to avoid duplicates
template["name"] = f"{template['name']}_{current_count + i + 1}"
current_workers.append(template)
if offline_workers and new_offline > 0:
for i in range(new_offline):
# Pick a random offline worker as template
template = random.choice(offline_workers).copy()
# Give it a new name to avoid duplicates
template["name"] = f"{template['name']}_{current_count + new_online + i + 1}"
current_workers.append(template)
# If no existing workers of either type, create new ones from scratch
if not online_workers and new_online > 0:
for i in range(new_online):
worker = self.create_default_worker(f"Miner_{current_count + i + 1}", "online")
current_workers.append(worker)
if not offline_workers and new_offline > 0:
for i in range(new_offline):
worker = self.create_default_worker(f"Miner_{current_count + new_online + i + 1}", "offline")
current_workers.append(worker)
        elif current_count > target_count:
            # Need to remove some workers
            # Drop workers from the end of the list to preserve earlier ones
            current_workers = current_workers[:target_count]
        # Update the worker data
        worker_data["workers"] = current_workers
def create_default_worker(self, name, status):
"""
Create a default worker with given name and status.
Args:
name (str): Worker name
status (str): Worker status ('online' or 'offline')
Returns:
dict: Default worker data
"""
is_online = status == "online"
current_time = datetime.now(ZoneInfo(get_timezone()))
# Generate some reasonable hashrate and other values
hashrate = round(random.uniform(50, 100), 2) if is_online else 0
last_share = current_time.strftime("%Y-%m-%d %H:%M") if is_online else (
(current_time - timedelta(hours=random.uniform(1, 24))).strftime("%Y-%m-%d %H:%M")
)
return {
"name": name,
"status": status,
"type": "ASIC",
"model": "Default Miner",
"hashrate_60sec": hashrate if is_online else 0,
"hashrate_60sec_unit": "TH/s",
"hashrate_3hr": hashrate if is_online else round(random.uniform(30, 80), 2),
"hashrate_3hr_unit": "TH/s",
"efficiency": round(random.uniform(80, 95), 1) if is_online else 0,
"last_share": last_share,
"earnings": round(random.uniform(0.0001, 0.001), 8),
"power_consumption": round(random.uniform(2000, 3500)) if is_online else 0,
"temperature": round(random.uniform(55, 75)) if is_online else 0
}
def generate_fallback_data(self, cached_metrics):
"""
Generate fallback worker data from cached metrics when real data can't be fetched.
Try to preserve real worker names if available.
Args:
cached_metrics (dict): Cached metrics from the dashboard
Returns:
dict: Generated worker data
"""
# If metrics aren't available yet, return default data
if not cached_metrics:
logging.warning("No cached metrics available for worker fallback data")
return self.generate_default_workers_data()
# Check if we have workers_hashing information
workers_count = cached_metrics.get("workers_hashing")
# Handle None value for workers_count
if workers_count is None:
logging.warning("No workers_hashing value in cached metrics, defaulting to 1 worker")
workers_count = 1
# Force at least 1 worker if the count is 0
elif workers_count <= 0:
logging.warning("No workers reported in metrics, forcing 1 worker")
workers_count = 1
# Get hashrate from cached metrics
original_hashrate_3hr = float(cached_metrics.get("hashrate_3hr", 0) or 0)
hashrate_unit = cached_metrics.get("hashrate_3hr_unit", "TH/s")
# If hashrate is 0, set a minimum value to avoid empty display
if original_hashrate_3hr <= 0:
original_hashrate_3hr = 50.0
logging.warning(f"Hashrate was 0, setting minimum value of {original_hashrate_3hr} {hashrate_unit}")
# Check if we have any previously cached real worker names
real_worker_names = []
if self.worker_data_cache and self.worker_data_cache.get('workers'):
for worker in self.worker_data_cache['workers']:
name = worker.get('name', '')
# Only use names that don't look like status indicators
if name and name.lower() not in ['online', 'offline', 'total']:
real_worker_names.append(name)
# Generate worker data
workers_data = []
# If we have real worker names, use them
if real_worker_names:
logging.info(f"Using {len(real_worker_names)} real worker names from cache")
workers_data = self.generate_simulated_workers(
workers_count,
original_hashrate_3hr,
hashrate_unit,
real_worker_names=real_worker_names
)
else:
# Otherwise use sequential names
logging.info("No real worker names available, using sequential names")
workers_data = self.generate_sequential_workers(
workers_count,
original_hashrate_3hr,
hashrate_unit
)
# Calculate basic statistics
workers_online = len([w for w in workers_data if w['status'] == 'online'])
workers_offline = len(workers_data) - workers_online
# Use unpaid_earnings from main dashboard
unpaid_earnings = cached_metrics.get("unpaid_earnings", 0)
# Handle case where unpaid_earnings might be a string
if isinstance(unpaid_earnings, str):
try:
# Handle case where it might include "BTC" or other text
unpaid_earnings = float(unpaid_earnings.split()[0].replace(',', ''))
except (ValueError, IndexError):
unpaid_earnings = 0.001
# Ensure we have a minimum value for unpaid earnings
if unpaid_earnings <= 0:
unpaid_earnings = 0.001
# Use unpaid_earnings as total_earnings
total_earnings = unpaid_earnings
# ---- IMPORTANT FIX: Daily sats calculation ----
# Get daily_mined_sats directly from cached metrics
daily_sats = cached_metrics.get("daily_mined_sats", 0)
# If daily_sats is missing or zero, try to calculate it from other available metrics
if daily_sats is None or daily_sats == 0:
logging.warning("daily_mined_sats is missing or zero, attempting alternative calculations")
# Try to calculate from daily_btc_net
if cached_metrics.get("daily_btc_net") is not None:
daily_btc_net = cached_metrics.get("daily_btc_net")
daily_sats = int(round(daily_btc_net * self.sats_per_btc))
logging.info(f"Calculated daily_sats from daily_btc_net: {daily_sats}")
# Alternative calculation from estimated_earnings_per_day
elif cached_metrics.get("estimated_earnings_per_day") is not None:
daily_btc = cached_metrics.get("estimated_earnings_per_day")
daily_sats = int(round(daily_btc * self.sats_per_btc))
logging.info(f"Calculated daily_sats from estimated_earnings_per_day: {daily_sats}")
# If still zero, try to use estimated_earnings_per_day_sats directly
elif cached_metrics.get("estimated_earnings_per_day_sats") is not None:
daily_sats = cached_metrics.get("estimated_earnings_per_day_sats")
logging.info(f"Using estimated_earnings_per_day_sats as fallback: {daily_sats}")
logging.info(f"Final daily_sats value: {daily_sats}")
# Create hashrate history based on arrow_history if available
hashrate_history = []
if cached_metrics.get("arrow_history") and cached_metrics["arrow_history"].get("hashrate_3hr"):
hashrate_history = cached_metrics["arrow_history"]["hashrate_3hr"]
result = {
"workers": workers_data,
"workers_total": len(workers_data),
"workers_online": workers_online,
"workers_offline": workers_offline,
"total_hashrate": original_hashrate_3hr,
"hashrate_unit": hashrate_unit,
"total_earnings": total_earnings,
"daily_sats": daily_sats, # Fixed daily_sats value
"hashrate_history": hashrate_history,
"timestamp": datetime.now(ZoneInfo(get_timezone())).isoformat()
}
# Update cache
self.worker_data_cache = result
self.last_worker_data_update = datetime.now().timestamp()
logging.info(f"Generated fallback data with {len(workers_data)} workers")
return result
def generate_sequential_workers(self, num_workers, total_hashrate, hashrate_unit, total_unpaid_earnings=None):
"""
Generate workers with sequential names when other methods fail.
Args:
num_workers (int): Number of workers
total_hashrate (float): Total hashrate
hashrate_unit (str): Hashrate unit
total_unpaid_earnings (float, optional): Total unpaid earnings
Returns:
list: List of worker data dictionaries
"""
logging.info(f"Generating {num_workers} workers with sequential names")
# Ensure we have at least 1 worker
num_workers = max(1, num_workers)
# Worker model types for simulation
models = [
{"type": "ASIC", "model": "Bitmain Antminer S19 Pro", "max_hashrate": 110, "power": 3250},
{"type": "ASIC", "model": "Bitmain Antminer T21", "max_hashrate": 130, "power": 3276},
{"type": "ASIC", "model": "Bitmain Antminer S19j Pro", "max_hashrate": 104, "power": 3150},
{"type": "Bitaxe", "model": "Bitaxe Gamma 601", "max_hashrate": 3.2, "power": 35}
]
# Calculate hashrate distribution - majority of hashrate to online workers
online_count = max(1, int(num_workers * 0.8)) # At least 1 online worker
offline_count = num_workers - online_count
# Average hashrate per online worker (ensure it's at least 0.5 TH/s)
avg_hashrate = max(0.5, total_hashrate / online_count if online_count > 0 else 0)
workers = []
current_time = datetime.now(ZoneInfo(get_timezone()))
# Default total unpaid earnings if not provided
if total_unpaid_earnings is None or total_unpaid_earnings <= 0:
total_unpaid_earnings = 0.001 # Default small amount
# Generate online workers with sequential names
for i in range(online_count):
            # Select a model: ASIC models for most workers, a Bitaxe for the
            # last worker when the average hashrate is small
            if i < online_count - 1 or avg_hashrate > 5:
                model_idx = random.randint(0, len(models) - 2)  # Exclude Bitaxe for most workers
            else:
                model_idx = len(models) - 1  # Bitaxe for last worker if small hashrate
            model_info = models[model_idx]
# Generate hashrate with some random variation
base_hashrate = min(model_info["max_hashrate"], avg_hashrate * random.uniform(0.5, 1.5))
hashrate_60sec = round(base_hashrate * random.uniform(0.9, 1.1), 2)
hashrate_3hr = round(base_hashrate * random.uniform(0.85, 1.0), 2)
# Generate last share time (within last 5 minutes)
minutes_ago = random.randint(0, 5)
last_share = (current_time - timedelta(minutes=minutes_ago)).strftime("%Y-%m-%d %H:%M")
# Generate temperature (normal operating range)
temperature = random.randint(55, 70) if model_info["type"] == "ASIC" else random.randint(45, 55)
# Create a sequential name
name = f"Miner_{i+1}"
workers.append({
"name": name,
"status": "online",
"type": model_info["type"],
"model": model_info["model"],
"hashrate_60sec": hashrate_60sec,
"hashrate_60sec_unit": hashrate_unit,
"hashrate_3hr": hashrate_3hr,
"hashrate_3hr_unit": hashrate_unit,
"efficiency": round(random.uniform(65, 95), 1),
"last_share": last_share,
"earnings": 0, # Will be set after all workers are generated
"power_consumption": model_info["power"],
"temperature": temperature
})
# Generate offline workers
for i in range(offline_count):
# Select a model - more likely to be Bitaxe for offline
if random.random() > 0.6:
model_info = models[-1] # Bitaxe
else:
model_info = random.choice(models[:-1]) # ASIC
# Generate last share time (0.5 to 8 hours ago)
hours_ago = random.uniform(0.5, 8)
last_share = (current_time - timedelta(hours=hours_ago)).strftime("%Y-%m-%d %H:%M")
# Generate hashrate (historical before going offline)
if model_info["type"] == "Bitaxe":
hashrate_3hr = round(random.uniform(1, 3), 2)
else:
hashrate_3hr = round(random.uniform(20, 90), 2)
# Create a sequential name
idx = i + online_count # Index for offline workers starts after online workers
name = f"Miner_{idx+1}"
workers.append({
"name": name,
"status": "offline",
"type": model_info["type"],
"model": model_info["model"],
"hashrate_60sec": 0,
"hashrate_60sec_unit": hashrate_unit,
"hashrate_3hr": hashrate_3hr,
"hashrate_3hr_unit": hashrate_unit,
"efficiency": 0,
"last_share": last_share,
"earnings": 0, # Minimal earnings for offline workers
"power_consumption": 0,
"temperature": 0
})
# Distribute earnings based on hashrate proportion
# Reserve a small portion (5%) of earnings for offline workers
online_earnings_pool = total_unpaid_earnings * 0.95
offline_earnings_pool = total_unpaid_earnings * 0.05
# Distribute earnings based on hashrate proportion for online workers
total_effective_hashrate = sum(w["hashrate_3hr"] for w in workers if w["status"] == "online")
if total_effective_hashrate > 0:
for worker in workers:
if worker["status"] == "online":
hashrate_proportion = worker["hashrate_3hr"] / total_effective_hashrate
worker["earnings"] = round(online_earnings_pool * hashrate_proportion, 8)
# Distribute minimal earnings to offline workers
if offline_count > 0:
offline_per_worker = offline_earnings_pool / offline_count
for worker in workers:
if worker["status"] == "offline":
worker["earnings"] = round(offline_per_worker, 8)
logging.info(f"Generated {len(workers)} workers with sequential names")
return workers
def generate_simulated_workers(self, num_workers, total_hashrate, hashrate_unit, total_unpaid_earnings=None, real_worker_names=None):
"""
Generate simulated worker data based on total hashrate.
This is a fallback method used when real data can't be fetched.
Args:
num_workers (int): Number of workers
total_hashrate (float): Total hashrate
hashrate_unit (str): Hashrate unit
total_unpaid_earnings (float, optional): Total unpaid earnings
real_worker_names (list, optional): List of real worker names to use instead of random names
Returns:
list: List of worker data dictionaries
"""
# Ensure we have at least 1 worker
num_workers = max(1, num_workers)
# Worker model types for simulation
models = [
{"type": "ASIC", "model": "Bitmain Antminer S19k Pro", "max_hashrate": 110, "power": 3250},
{"type": "ASIC", "model": "Bitmain Antminer T21", "max_hashrate": 130, "power": 3276},
{"type": "ASIC", "model": "Bitmain Antminer S19j Pro", "max_hashrate": 104, "power": 3150},
{"type": "Bitaxe", "model": "Bitaxe Gamma 601", "max_hashrate": 3.2, "power": 35}
]
# Worker names for simulation - only used if no real worker names are provided
prefixes = ["Antminer", "Miner", "Rig", "Node", "Worker", "BitAxe", "BTC"]
# Calculate hashrate distribution - majority of hashrate to online workers
online_count = max(1, int(num_workers * 0.8)) # At least 1 online worker
offline_count = num_workers - online_count
# Average hashrate per online worker (ensure it's at least 0.5 TH/s)
avg_hashrate = max(0.5, total_hashrate / online_count if online_count > 0 else 0)
workers = []
current_time = datetime.now(ZoneInfo(get_timezone()))
# Default total unpaid earnings if not provided
if total_unpaid_earnings is None or total_unpaid_earnings <= 0:
total_unpaid_earnings = 0.001 # Default small amount
# Prepare name list - use real names if available, otherwise will generate random names
# If we have real names but not enough, we'll reuse them or generate additional random ones
name_list = []
if real_worker_names and len(real_worker_names) > 0:
logging.info(f"Using {len(real_worker_names)} real worker names")
# Ensure we have enough names by cycling through the list if needed
name_list = real_worker_names * (num_workers // len(real_worker_names) + 1)
name_list = name_list[:num_workers] # Truncate to exact number needed
# Generate online workers
for i in range(online_count):
            # Select a model: ASIC models for most workers, a Bitaxe for the
            # last worker when the average hashrate is small
            if i < online_count - 1 or avg_hashrate > 5:
                model_idx = random.randint(0, len(models) - 2)  # Exclude Bitaxe for most workers
            else:
                model_idx = len(models) - 1  # Bitaxe for last worker if small hashrate
            model_info = models[model_idx]
# Generate hashrate with some random variation
base_hashrate = min(model_info["max_hashrate"], avg_hashrate * random.uniform(0.5, 1.5))
hashrate_60sec = round(base_hashrate * random.uniform(0.9, 1.1), 2)
hashrate_3hr = round(base_hashrate * random.uniform(0.85, 1.0), 2)
# Generate last share time (within last 3 minutes)
minutes_ago = random.randint(0, 3)
last_share = (current_time - timedelta(minutes=minutes_ago)).strftime("%Y-%m-%d %H:%M")
# Generate temperature (normal operating range)
temperature = random.randint(55, 70) if model_info["type"] == "ASIC" else random.randint(45, 55)
# Use a real name if available, otherwise generate a random name
if name_list and i < len(name_list):
name = name_list[i]
else:
# Create a unique name
if model_info["type"] == "Bitaxe":
name = f"{prefixes[-1]}{random.randint(1, 99):02d}"
else:
name = f"{random.choice(prefixes[:-1])}{random.randint(1, 99):02d}"
workers.append({
"name": name,
"status": "online",
"type": model_info["type"],
"model": model_info["model"],
"hashrate_60sec": hashrate_60sec,
"hashrate_60sec_unit": hashrate_unit,
"hashrate_3hr": hashrate_3hr,
"hashrate_3hr_unit": hashrate_unit,
"efficiency": round(random.uniform(65, 95), 1),
"last_share": last_share,
"earnings": 0, # Will be set after all workers are generated
"power_consumption": model_info["power"],
"temperature": temperature
})
# Generate offline workers
for i in range(offline_count):
# Select a model - more likely to be Bitaxe for offline
if random.random() > 0.6:
model_info = models[-1] # Bitaxe
else:
model_info = random.choice(models[:-1]) # ASIC
# Generate last share time (0.5 to 8 hours ago)
hours_ago = random.uniform(0.5, 8)
last_share = (current_time - timedelta(hours=hours_ago)).strftime("%Y-%m-%d %H:%M")
# Generate hashrate (historical before going offline)
if model_info["type"] == "Bitaxe":
hashrate_3hr = round(random.uniform(1, 3), 2)
else:
hashrate_3hr = round(random.uniform(20, 90), 2)
# Use a real name if available, otherwise generate a random name
idx = i + online_count # Index for offline workers starts after online workers
if name_list and idx < len(name_list):
name = name_list[idx]
else:
# Create a unique name
if model_info["type"] == "Bitaxe":
name = f"{prefixes[-1]}{random.randint(1, 99):02d}"
else:
name = f"{random.choice(prefixes[:-1])}{random.randint(1, 99):02d}"
workers.append({
"name": name,
"status": "offline",
"type": model_info["type"],
"model": model_info["model"],
"hashrate_60sec": 0,
"hashrate_60sec_unit": hashrate_unit,
"hashrate_3hr": hashrate_3hr,
"hashrate_3hr_unit": hashrate_unit,
"efficiency": 0,
"last_share": last_share,
"earnings": 0, # Minimal earnings for offline workers
"power_consumption": 0,
"temperature": 0
})
# Calculate the current sum of online worker hashrates
current_total = sum(w["hashrate_3hr"] for w in workers if w["status"] == "online")
# If we have online workers and the total doesn't match, apply a scaling factor
if online_count > 0 and abs(current_total - total_hashrate) > 0.01 and current_total > 0:
scaling_factor = total_hashrate / current_total
# Apply scaling to all online workers
for worker in workers:
if worker["status"] == "online":
# Scale the 3hr hashrate to exactly match total
worker["hashrate_3hr"] = round(worker["hashrate_3hr"] * scaling_factor, 2)
# Scale the 60sec hashrate proportionally
if worker["hashrate_60sec"] > 0:
worker["hashrate_60sec"] = round(worker["hashrate_60sec"] * scaling_factor, 2)
# Reserve a small portion (5%) of earnings for offline workers
online_earnings_pool = total_unpaid_earnings * 0.95
offline_earnings_pool = total_unpaid_earnings * 0.05
# Distribute earnings based on hashrate proportion for online workers
total_effective_hashrate = sum(w["hashrate_3hr"] for w in workers if w["status"] == "online")
if total_effective_hashrate > 0:
for worker in workers:
if worker["status"] == "online":
hashrate_proportion = worker["hashrate_3hr"] / total_effective_hashrate
worker["earnings"] = round(online_earnings_pool * hashrate_proportion, 8)
# Distribute minimal earnings to offline workers
if offline_count > 0:
offline_per_worker = offline_earnings_pool / offline_count
for worker in workers:
if worker["status"] == "offline":
worker["earnings"] = round(offline_per_worker, 8)
# Final verification - ensure total earnings match
current_total_earnings = sum(w["earnings"] for w in workers)
if abs(current_total_earnings - total_unpaid_earnings) > 0.00000001:
# Adjust the first worker to account for any rounding errors
adjustment = total_unpaid_earnings - current_total_earnings
for worker in workers:
if worker["status"] == "online":
worker["earnings"] = round(worker["earnings"] + adjustment, 8)
break
return workers
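The earnings logic above (a 95/5 online/offline split, proportional-to-hashrate shares, and a rounding-drift adjustment on the first online worker) can be condensed into a standalone sketch; the helper name is illustrative, not part of the module:

```python
def distribute_earnings(workers, total_unpaid, online_share=0.95):
    """Split unpaid earnings across workers in proportion to 3hr hashrate,
    reserving a small pool for offline workers, then absorbing any rounding
    drift into the first online worker so totals match to 8 decimals."""
    online = [w for w in workers if w["status"] == "online"]
    offline = [w for w in workers if w["status"] == "offline"]
    online_pool = total_unpaid * online_share
    offline_pool = total_unpaid - online_pool
    total_hr = sum(w["hashrate_3hr"] for w in online)
    for w in online:
        w["earnings"] = round(online_pool * w["hashrate_3hr"] / total_hr, 8) if total_hr else 0.0
    if offline:
        per_worker = round(offline_pool / len(offline), 8)
        for w in offline:
            w["earnings"] = per_worker
    # Absorb rounding drift so the sum matches total_unpaid
    drift = total_unpaid - sum(w["earnings"] for w in workers)
    if online and abs(drift) > 1e-8:
        online[0]["earnings"] = round(online[0]["earnings"] + drift, 8)
    return workers
```

Distributing by the 3hr average rather than the 60-second reading keeps shares stable across short hashrate spikes, which is why the methods above use `hashrate_3hr` as the weight.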