* first commit

* Add a custom activity manager and refactor activity logic

* Add diagnostics management (screenshot and HTML capture) and improve humanized page interactions, among other changes.

* Add log and webhook settings management, including message filtering and an improved structure for the summaries sent.

* Add post-run auto-update, including Git and Docker update options and a new configuration section to manage these settings.

* Add account loading in Docker, with options to use an environment file or inline JSON, plus minimal validation of the loaded accounts.

* Improve the Microsoft Rewards script display with a new banner and better log handling, including colors and improved console formatting.

* v2

* Refactor ESLint configuration and scripts for improved TypeScript support and project structure

* Add suspended-account detection with improved error handling and logging of ban reasons

* Add a built-in scheduler for scheduled task execution, configurable in config.json, among other changes

* Edit

* Remove text

* Update documentation and add humanization settings to the configuration, among other changes.

* Add a manual purchase mode letting users spend points without automation, with spending tracking and notifications.

* Fix documentation and improve configuration handling for the manual purchase mode, adding complete documentation and the appropriate banner display.

* Add comprehensive documentation for job state persistence, NTFY notifications, proxy configuration, scheduling, and auto-update features

- Introduced job state persistence documentation to track progress and resume tasks.
- Added an NTFY push notifications integration guide for real-time alerts (a minimal sending sketch follows this list).
- Documented proxy configuration options for enhanced privacy and network management.
- Included scheduling configuration for automated script execution.
- Documented auto-update configuration to keep installations current, with Git and Docker options.
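A minimal sketch of how such an NTFY push could be sent, assuming a plain-text POST to `<url>/<topic>` with an optional bearer token; the helper name and its placement are illustrative, not the project's actual code:

```ts
// Minimal NTFY push sketch (assumption: the bot posts plain text to <url>/<topic>).
interface NtfyConfig { enabled: boolean; url: string; topic: string; authToken?: string }

export async function sendNtfy(cfg: NtfyConfig, title: string, message: string): Promise<void> {
  if (!cfg.enabled || !cfg.url || !cfg.topic) return
  const endpoint = `${cfg.url.replace(/\/$/, '')}/${cfg.topic}`
  await fetch(endpoint, {
    method: 'POST',
    headers: {
      'Title': title,
      ...(cfg.authToken ? { 'Authorization': `Bearer ${cfg.authToken}` } : {})
    },
    body: message
  }).catch(() => { /* notifications must never break the run */ })
}
```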

* Add a community error reporting system to improve debugging, including configuration and the sending of anonymized error summaries to a Discord webhook.

* Mini Edit

* Update README.md to improve presentation and clarity, add a section on real-time notifications, and update badges for better visibility.

* Documentation update

* Edit README.md

* Edit

* Update README with legacy version link

* Improve location data and webhook handling, and add configuration normalization

* Force update for PR

* Improve documentation and configuration options for cron integration and Docker use

* Improve scheduling documentation and add multi-pass support to the daily execution script

* Remove the CommunityReport feature in accordance with project policy

* Add randomized start-up times and watchdog timing for scheduler runs

* Refactor Docker setup to use built-in scheduler, removing cron dependencies and simplifying configuration options

* Add TOTP support for authentication: update interfaces and configuration files to include a TOTP secret, and generate the TOTP code automatically on login.
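A minimal sketch of the TOTP generation step, assuming the commonly used `otplib` package; the project's actual library and helper names may differ:

```ts
// Sketch only: assumes the 'otplib' package; the project may use a different TOTP library.
import { authenticator } from 'otplib'

/** Generate the current 6-digit TOTP code from a Base32 secret (account.totp). */
export function generateTotpCode(totpSecret: string): string | null {
  const secret = totpSecret.replace(/\s+/g, '')
  if (!secret) return null
  try {
    return authenticator.generate(secret) // RFC 6238, 30-second time step by default
  } catch {
    return null // invalid secret: fall back to the regular 2FA prompt
  }
}
```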

* Fix [LOGIN-NO-PROMPT] No dialogs (xX)

* Reset the TOTP field for email_1 in the accounts.example.json file

* Reset the TOTP field for email_1 in the README.md file

* Improve Bing search: use the 'attached' state for the search field, handle overlays, and fall back to direct navigation when typing fails.

* Add a complete security policy, including guidelines on vulnerability handling, coordinated disclosure, and user security advice.

* Remove advanced environment variables section from README

* Update configuration and Dockerfile: move to Node 22, add buy-mode handling, and remove obsolete scripts

* Fix the order of sections in README.md for better readability

* Update README and security policy: add the buy mode and clarify security and privacy practices.

* Improve README readability and remove the vulnerability-reporting mention from the security document.

* Add humanization handling and adaptive throttling to simulate more human behavior in bot activities.
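The workers use `throttle.record(success)` and `throttle.getDelayMultiplier()` (see the Workers diff further down); a hypothetical sketch of that idea, not the project's actual `AdaptiveThrottler` implementation:

```ts
// Hypothetical sketch of adaptive throttling; the real AdaptiveThrottler may differ.
export class AdaptiveThrottlerSketch {
  private multiplier = 1

  /** Record the outcome of an action; failures slow the bot down, successes speed it back up. */
  record(success: boolean): void {
    this.multiplier = success
      ? Math.max(1, this.multiplier * 0.9)   // decay back toward the normal pace
      : Math.min(4, this.multiplier * 1.5)   // back off after errors (capped)
  }

  /** Multiplier applied to the random waits between activities and searches. */
  getDelayMultiplier(): number {
    return this.multiplier
  }
}

// Usage mirrors the workers: wait(randomNumber(800 * m, 1400 * m)) where m = throttle.getDelayMultiplier()
```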

* Add humanization handling: enable/disable human gestures, update the configuration, and add documentation on human mode.

* Remove the community error report feature to respect the privacy policy

* Add immediate ban alerts and vacation configuration to the Microsoft Rewards bot

* Add immediate ban alerts and vacation configuration to the Microsoft Rewards bot

* Added scheduling support: 12-hour and 24-hour time formats, time-zone options, and immediate execution on startup.
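A minimal sketch of resolving the configured run time from the `schedule` block (`useAmPm`, `time12`, `time24`, legacy `time`); the function name is illustrative:

```ts
// Sketch: resolve the configured run time to a 24h hour/minute pair, assuming the schedule
// block from config.json (useAmPm, time12, time24, legacy time). Names here are illustrative.
interface ScheduleConfig { useAmPm?: boolean; time12?: string; time24?: string; time?: string }

export function resolveRunTime(s: ScheduleConfig): { hour: number; minute: number } {
  const pick = s.useAmPm ? (s.time12 || s.time || '9:00 AM') : (s.time24 || s.time || '09:00')
  const ampm = /(\d{1,2}):(\d{2})\s*(AM|PM)/i.exec(pick)
  if (ampm) {
    let hour = Number(ampm[1]) % 12            // 12:xx AM -> 0, 12:xx PM -> 12
    if (/PM/i.test(ampm[3])) hour += 12
    return { hour, minute: Number(ampm[2]) }
  }
  const [h, m] = pick.split(':').map(Number)   // 24h "HH:mm"
  return { hour: h || 0, minute: m || 0 }
}
```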

* Added window size normalization and page rendering to fit typical screens, with injected CSS styles to prevent excessive zooming.

* Added security incident management: detection of masked recovery-email mismatches, automation blocking, and global alerts. Updated configuration files and interfaces to include recovery emails. Improved security incident documentation.

* Refactor incident alert handling: unified alert sender

* s

* Added security incident management: detect recovery email inconsistencies and send unified alerts. Implemented helper methods to manage alerts and compromised modes.

* Added heartbeat management for the scheduler: integrated a heartbeat file to report liveness and adjusted the watchdog configuration to account for heartbeat updates.
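A minimal sketch of the heartbeat idea: the scheduler periodically touches a file and a watchdog treats a stale timestamp as a stalled run. The file path and intervals are assumptions:

```ts
// Sketch of the heartbeat mechanism; path and intervals are illustrative.
import { writeFileSync, statSync } from 'fs'

const HEARTBEAT_FILE = 'sessions/scheduler.heartbeat'   // assumed location

/** Scheduler side: refresh the heartbeat file on a fixed interval. */
export function startHeartbeat(intervalMs = 60_000): NodeJS.Timeout {
  return setInterval(() => {
    try { writeFileSync(HEARTBEAT_FILE, String(Date.now())) } catch { /* ignore */ }
  }, intervalMs)
}

/** Watchdog side: consider the scheduler stalled if the heartbeat is older than the threshold. */
export function isStalled(maxAgeMs = 10 * 60_000): boolean {
  try { return Date.now() - statSync(HEARTBEAT_FILE).mtimeMs > maxAgeMs }
  catch { return true } // no heartbeat file yet -> treat as stalled
}
```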

* Edit webhook

* Updated security alert handling: masked the recovery email example in the documentation and enabled the conclusion webhook for notifications.

* Improved security alert handling: added structured webhook payloads for better visibility and updated the alert interval used in compromised mode.
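A minimal sketch of a structured alert posted to a Discord-compatible webhook; the JSON POST transport is standard, while the embed fields shown here are illustrative rather than the bot's exact payload:

```ts
// Sketch of a structured security alert sent to a Discord-compatible webhook.
export async function sendSecurityAlert(webhookUrl: string, account: string, reason: string): Promise<void> {
  if (!webhookUrl) return
  const payload = {
    embeds: [{
      title: 'Security incident detected',
      color: 0xE74C3C,
      fields: [
        { name: 'Account', value: account, inline: true },
        { name: 'Reason', value: reason, inline: false }
      ],
      timestamp: new Date().toISOString()
    }]
  }
  await fetch(webhookUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  }).catch(() => { /* alerting must not crash the run */ })
}
```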

* Edit conf

* Improved dependency installation: Added the --ignore-scripts option for npm ci and npm install. Updated comments in compose.yaml for clarity.

* Refactor documentation structure and enhance logging:
- Moved documentation files from 'information' to 'docs' directory for better organization.
- Added live logging configuration to support webhook logs with email redaction.
- Updated file paths in configuration and loading functions to accommodate new structure.
- Adjusted scheduler behavior to prevent immediate runs unless explicitly set.
- Improved error handling for account and config file loading.
- Enhanced security incident documentation with detailed recovery steps.

* Fix docs

* Remove outdated documentation on NTFY, Proxy, Scheduling, Security, and Auto-Update configurations; update Browser class to prioritize headless mode based on environment variable.

* Add documentation for account management and TOTP, a Docker guide, and update the documentation index.

* Update Docker documentation: simplify instructions and add links to detailed guides. Revise the configuration options and troubleshooting sections.

* Edit

* Edit docs

* Enhance documentation for Scheduler, Security, and Auto-Update features

- Revamped the Scheduler documentation to include detailed features, configuration options, and usage examples.
- Expanded the Security guide with comprehensive incident response strategies, privacy measures, and monitoring practices.
- Updated the Auto-Update section to clarify configuration, methods, and best practices for maintaining system integrity.

* Improved error handling and added crash recovery in the Microsoft Rewards bot. Added configuration for automatic restart and handling of local search queries when trends fail.
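A minimal sketch of the restart loop implied by the `crashRecovery` options (`autoRestart`, `maxRestarts`, `backoffBaseMs` in the Config interface below); how the real entry point re-runs itself is an assumption:

```ts
// Sketch of a crash-recovery loop based on ConfigCrashRecovery defaults (maxRestarts 2, backoffBaseMs 2000).
interface CrashRecovery { autoRestart?: boolean; maxRestarts?: number; backoffBaseMs?: number }

export async function runWithCrashRecovery(run: () => Promise<void>, cfg: CrashRecovery = {}): Promise<void> {
  const maxRestarts = cfg.maxRestarts ?? 2
  const base = cfg.backoffBaseMs ?? 2000
  for (let attempt = 0; ; attempt++) {
    try {
      await run()
      return
    } catch (e) {
      if (!cfg.autoRestart || attempt >= maxRestarts) throw e
      const delay = base * Math.pow(2, attempt)    // simple exponential backoff before restarting
      console.warn(`Run crashed (${e instanceof Error ? e.message : e}); restarting in ${delay}ms`)
      await new Promise(res => setTimeout(res, delay))
    }
  }
}
```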

* Fixed initial point counting in MicrosoftRewardsBot and improved error handling when sending summaries to webhooks.

* Added unified support for notifications and improved handling of webhook configurations in the normalizeConfig and log functions.
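A minimal sketch of the notification part of `normalizeConfig`: accept both the legacy root-level `webhook`/`conclusionWebhook`/`ntfy` keys and the newer `notifications.*` block from config.json. The real function may handle more keys:

```ts
// Sketch of notification config normalization; field names follow config.json and the Config interface.
type Webhook = { enabled: boolean; url: string }
type Ntfy = { enabled: boolean; url: string; topic: string; authToken?: string }
interface RawConfig {
  webhook?: Webhook
  conclusionWebhook?: Webhook
  ntfy?: Ntfy
  notifications?: { webhook?: Webhook; conclusionWebhook?: Webhook; ntfy?: Ntfy }
}

export function normalizeNotifications(cfg: RawConfig) {
  const n = cfg.notifications ?? {}
  return {
    webhook: n.webhook ?? cfg.webhook ?? { enabled: false, url: '' },
    conclusionWebhook: n.conclusionWebhook ?? cfg.conclusionWebhook ?? { enabled: false, url: '' },
    ntfy: n.ntfy ?? cfg.ntfy ?? { enabled: false, url: '', topic: 'rewards' }
  }
}
```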

* UPDATE LOGIN

* EDIT LOGIN

* Improved login error handling: added recovery mismatch detection and the ability to switch to password authentication.

* Added a full reference to configuration in the documentation and improved log and error handling in the code.

* Added context management for conclusion webhooks and improved user configuration for notifications.

* Mini edit

* Improved logic for extracting masked emails for more accurate matching during account recovery (a matching sketch appears after the commit metadata below).
This commit is contained in:
Light
2025-09-26 18:58:33 +02:00
committed by GitHub
parent 02160a07d9
commit 15f62963f8
60 changed files with 8186 additions and 1355 deletions
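A minimal sketch of masked-email matching, under the assumption that only the characters Microsoft leaves visible (e.g. `jo****@ou*****.com`) are compared against the configured `recoveryEmail`; the exact rules used in the bot may differ:

```ts
// Sketch: compare a masked recovery email shown on the login screen with the configured one.
export function maskedEmailMatches(masked: string, recoveryEmail: string): boolean {
  const clean = (s: string) => s.trim().toLowerCase()
  const [mUser = '', mDomain = ''] = clean(masked).split('@')
  const [rUser = '', rDomain = ''] = clean(recoveryEmail).split('@')
  if (!mUser || !mDomain || !rUser || !rDomain) return false

  // Compare only the characters left visible before the first '*'.
  const visiblePrefix = (s: string) => s.split('*')[0] ?? ''
  const userOk = rUser.startsWith(visiblePrefix(mUser))

  // Domains are usually masked the same way but keep their TLD (e.g. "ou*****.com").
  const mTld = mDomain.split('.').pop() ?? ''
  const rTld = rDomain.split('.').pop() ?? ''
  const domainOk = rDomain.startsWith(visiblePrefix(mDomain)) && mTld === rTld

  return userOk && domainOk
}
```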

View File

@@ -1,24 +1,31 @@
[
{
"_note": "Microsoft allows up to 3 new accounts per IP per day; in practice it is recommended not to exceed around 5 active accounts per household IP to avoid looking suspicious; there is no official lifetime cap, but creating too many accounts quickly may trigger verification (phone, OTP, captcha); Microsoft discourages bulk account creation from the same IP; unusual activity can result in temporary blocks or account restrictions.",
"accounts": [
{
"email": "email_1",
"password": "password_1",
"proxy": {
"proxyAxios": true,
"url": "",
"port": 0,
"username": "",
"password": ""
}
"email": "email_1",
"password": "password_1",
"totp": "",
"recoveryEmail": "your_email@domain.com",
"proxy": {
"proxyAxios": true,
"url": "",
"port": 0,
"username": "",
"password": ""
}
},
{
"email": "email_2",
"password": "password_2",
"proxy": {
"proxyAxios": true,
"url": "",
"port": 0,
"username": "",
"password": ""
}
"email": "email_2",
"password": "password_2",
"totp": "",
"recoveryEmail": "your_email@domain.com",
"proxy": {
"proxyAxios": true,
"url": "",
"port": 0,
"username": "",
"password": ""
}
}
]
]
}

View File

@@ -29,16 +29,25 @@ class Browser {
if (process.env.AUTO_INSTALL_BROWSERS === '1') {
try {
// Dynamically import child_process to avoid overhead otherwise
const { execSync } = await import('child_process') as any
const { execSync } = await import('child_process')
execSync('npx playwright install chromium', { stdio: 'ignore' })
} catch { /* silent */ }
}
let browser: any
let browser: import('rebrowser-playwright').Browser
// Support both legacy and new config structures (wider scope for later usage)
const cfgAny = this.bot.config as unknown as Record<string, unknown>
try {
// FORCE_HEADLESS env takes precedence (used in Docker with headless shell only)
const envForceHeadless = process.env.FORCE_HEADLESS === '1'
const headlessValue = envForceHeadless ? true : ((cfgAny['headless'] as boolean | undefined) ?? (cfgAny['browser'] && (cfgAny['browser'] as Record<string, unknown>)['headless'] as boolean | undefined) ?? false)
const headless: boolean = Boolean(headlessValue)
const engineName = 'chromium' // current hard-coded engine
this.bot.log(this.bot.isMobile, 'BROWSER', `Launching ${engineName} (headless=${headless})`) // explicit engine log
browser = await playwright.chromium.launch({
//channel: 'msedge', // Uses Edge instead of chrome
headless: this.bot.config.headless,
headless,
...(proxy.url && { proxy: { username: proxy.username, password: proxy.password, server: `${proxy.url}:${proxy.port}` } }),
args: [
'--no-sandbox',
@@ -49,7 +58,7 @@ class Browser {
'--ignore-ssl-errors'
]
})
} catch (e: any) {
} catch (e: unknown) {
const msg = (e instanceof Error ? e.message : String(e))
// Common missing browser executable guidance
if (/Executable doesn't exist/i.test(msg)) {
@@ -60,18 +69,57 @@ class Browser {
throw e
}
const sessionData = await loadSessionData(this.bot.config.sessionPath, email, this.bot.isMobile, this.bot.config.saveFingerprint)
// Resolve saveFingerprint from legacy root or new fingerprinting.saveFingerprint
const fpConfig = (cfgAny['saveFingerprint'] as unknown) || ((cfgAny['fingerprinting'] as Record<string, unknown> | undefined)?.['saveFingerprint'] as unknown)
const saveFingerprint: { mobile: boolean; desktop: boolean } = (fpConfig as { mobile: boolean; desktop: boolean }) || { mobile: false, desktop: false }
const sessionData = await loadSessionData(this.bot.config.sessionPath, email, this.bot.isMobile, saveFingerprint)
const fingerprint = sessionData.fingerprint ? sessionData.fingerprint : await this.generateFingerprint()
const context = await newInjectedContext(browser as any, { fingerprint: fingerprint })
const context = await newInjectedContext(browser as unknown as import('playwright').Browser, { fingerprint: fingerprint })
// Set timeout to preferred amount
context.setDefaultTimeout(this.bot.utils.stringToMs(this.bot.config?.globalTimeout ?? 30000))
// Set timeout to preferred amount (supports legacy globalTimeout or browser.globalTimeout)
const globalTimeout = (cfgAny['globalTimeout'] as unknown) ?? ((cfgAny['browser'] as Record<string, unknown> | undefined)?.['globalTimeout'] as unknown) ?? 30000
context.setDefaultTimeout(this.bot.utils.stringToMs(globalTimeout as (number | string)))
// Normalize viewport and page rendering so content fits typical screens
try {
const desktopViewport = { width: 1280, height: 800 }
const mobileViewport = { width: 390, height: 844 }
context.on('page', async (page) => {
try {
// Set a reasonable viewport size depending on device type
if (this.bot.isMobile) {
await page.setViewportSize(mobileViewport)
} else {
await page.setViewportSize(desktopViewport)
}
// Inject a tiny CSS to avoid gigantic scaling on some environments
await page.addInitScript(() => {
try {
const style = document.createElement('style')
style.id = '__mrs_fit_style'
style.textContent = `
html, body { overscroll-behavior: contain; }
/* Mild downscale to keep content within window on very large DPI */
@media (min-width: 1000px) {
html { zoom: 0.9 !important; }
}
`
document.documentElement.appendChild(style)
} catch { /* ignore */ }
})
} catch { /* ignore */ }
})
} catch { /* ignore */ }
await context.addCookies(sessionData.cookies)
if (this.bot.config.saveFingerprint) {
// Persist fingerprint when feature is configured
if (fpConfig) {
await saveFingerprintData(this.bot.config.sessionPath, email, this.bot.isMobile, fingerprint)
}

View File

@@ -40,10 +40,17 @@ export default class BrowserFunc {
await this.bot.utils.wait(3000)
await this.bot.browser.utils.tryDismissAllMessages(page)
// Check if account is suspended
const isSuspended = await page.waitForSelector('#suspendedAccountHeader', { state: 'visible', timeout: 2000 }).then(() => true).catch(() => false)
if (isSuspended) {
this.bot.log(this.bot.isMobile, 'GO-HOME', 'This account is suspended!', 'error')
// Check if account is suspended (multiple heuristics)
const suspendedByHeader = await page.waitForSelector('#suspendedAccountHeader', { state: 'visible', timeout: 1500 }).then(() => true).catch(() => false)
let suspendedByText = false
if (!suspendedByHeader) {
try {
const text = (await page.textContent('body')) || ''
suspendedByText = /account has been suspended|suspended due to unusual activity/i.test(text)
} catch { /* ignore */ }
}
if (suspendedByHeader || suspendedByText) {
this.bot.log(this.bot.isMobile, 'GO-HOME', 'This account appears suspended!', 'error')
throw new Error('Account has been suspended!')
}
@@ -82,21 +89,22 @@ export default class BrowserFunc {
* Fetch user dashboard data
* @returns {DashboardData} Object of user bing rewards dashboard data
*/
async getDashboardData(): Promise<DashboardData> {
async getDashboardData(page?: Page): Promise<DashboardData> {
const target = page ?? this.bot.homePage
const dashboardURL = new URL(this.bot.config.baseURL)
const currentURL = new URL(this.bot.homePage.url())
const currentURL = new URL(target.url())
try {
// Should never happen since tasks are opened in a new tab!
if (currentURL.hostname !== dashboardURL.hostname) {
this.bot.log(this.bot.isMobile, 'DASHBOARD-DATA', 'Provided page did not equal dashboard page, redirecting to dashboard page')
await this.goHome(this.bot.homePage)
await this.goHome(target)
}
let lastError: any = null
let lastError: unknown = null
for (let attempt = 1; attempt <= 2; attempt++) {
try {
// Reload the page to get new data
await this.bot.homePage.reload({ waitUntil: 'domcontentloaded' })
await target.reload({ waitUntil: 'domcontentloaded' })
lastError = null
break
} catch (re) {
@@ -108,7 +116,7 @@ export default class BrowserFunc {
if (attempt === 1) {
this.bot.log(this.bot.isMobile, 'GET-DASHBOARD-DATA', 'Page appears closed; trying one navigation fallback', 'warn')
try {
await this.goHome(this.bot.homePage)
await this.goHome(target)
} catch {/* ignore */}
} else {
break
@@ -119,7 +127,7 @@ export default class BrowserFunc {
}
}
const scriptContent = await this.bot.homePage.evaluate(() => {
const scriptContent = await target.evaluate(() => {
const scripts = Array.from(document.querySelectorAll('script'))
const targetScript = scripts.find(script => script.innerText.includes('var dashboard'))
@@ -131,7 +139,7 @@ export default class BrowserFunc {
}
// Extract the dashboard object from the script content
const dashboardData = await this.bot.homePage.evaluate((scriptContent: string) => {
const dashboardData = await target.evaluate((scriptContent: string) => {
// Extract the dashboard object using regex
const regex = /var dashboard = (\{.*?\});/s
const match = regex.exec(scriptContent)
@@ -232,8 +240,12 @@ export default class BrowserFunc {
]
const data = await this.getDashboardData()
let geoLocale = data.userProfile.attributes.country
geoLocale = (this.bot.config.searchSettings.useGeoLocaleQueries && geoLocale.length === 2) ? geoLocale.toLowerCase() : 'us'
// Guard against missing profile/attributes and undefined settings
let geoLocale = data?.userProfile?.attributes?.country || 'US'
const useGeo = !!(this.bot?.config?.searchSettings?.useGeoLocaleQueries)
geoLocale = (useGeo && typeof geoLocale === 'string' && geoLocale.length === 2)
? geoLocale.toLowerCase()
: 'us'
const userDataRequest: AxiosRequestConfig = {
url: 'https://prod.rewardsplatform.microsoft.com/dapi/me?channel=SAAndroid&options=613',
@@ -295,9 +307,10 @@ export default class BrowserFunc {
const html = await page.content()
const $ = load(html)
const scriptContent = $('script').filter((index: number, element: any) => {
return $(element).text().includes('_w.rewardsQuizRenderInfo')
}).text()
const scriptContent = $('script')
.toArray()
.map(el => $(el).text())
.find(t => t.includes('_w.rewardsQuizRenderInfo')) || ''
if (scriptContent) {
const regex = /_w\.rewardsQuizRenderInfo\s*=\s*({.*?});/s
@@ -355,7 +368,10 @@ export default class BrowserFunc {
const html = await page.content()
const $ = load(html)
const element = $('.offer-cta').toArray().find((x: any) => x.attribs.href?.includes(activity.offerId))
const element = $('.offer-cta').toArray().find((x: unknown) => {
const el = x as { attribs?: { href?: string } }
return !!el.attribs?.href?.includes(activity.offerId)
})
if (element) {
selector = `a[href*="${element.attribs.href}"]`
}

View File

@@ -12,52 +12,57 @@ export default class BrowserUtil {
}
async tryDismissAllMessages(page: Page): Promise<void> {
const buttons = [
const attempts = 3
const buttonGroups: { selector: string; label: string; isXPath?: boolean }[] = [
{ selector: '#acceptButton', label: 'AcceptButton' },
{ selector: '.ext-secondary.ext-button', label: '"Skip for now" Button' },
{ selector: '#iLandingViewAction', label: 'iLandingViewAction' },
{ selector: '#iShowSkip', label: 'iShowSkip' },
{ selector: '#iNext', label: 'iNext' },
{ selector: '#iLooksGood', label: 'iLooksGood' },
{ selector: '#idSIButton9', label: 'idSIButton9' },
{ selector: '.ms-Button.ms-Button--primary', label: 'Primary Button' },
{ selector: '.c-glyph.glyph-cancel', label: 'Mobile Welcome Button' },
{ selector: '.maybe-later', label: 'Mobile Rewards App Banner' },
{ selector: '//div[@id="cookieConsentContainer"]//button[contains(text(), "Accept")]', label: 'Accept Cookie Consent Container', isXPath: true },
{ selector: '#bnp_btn_accept', label: 'Bing Cookie Banner' },
{ selector: '#reward_pivot_earn', label: 'Reward Coupon Accept' }
{ selector: '.optanon-allow-all, .optanon-alert-box-button', label: 'OneTrust Accept' },
{ selector: '.ext-secondary.ext-button', label: 'Skip For Now' },
{ selector: '#iLandingViewAction', label: 'Landing Continue' },
{ selector: '#iShowSkip', label: 'Show Skip' },
{ selector: '#iNext', label: 'Next' },
{ selector: '#iLooksGood', label: 'LooksGood' },
{ selector: '#idSIButton9', label: 'PrimaryLoginButton' },
{ selector: '.ms-Button.ms-Button--primary', label: 'Primary Generic' },
{ selector: '.c-glyph.glyph-cancel', label: 'Mobile Welcome Cancel' },
{ selector: '.maybe-later, button[data-automation-id*="maybeLater" i]', label: 'Maybe Later' },
{ selector: '#bnp_btn_reject', label: 'Bing Cookie Reject' },
{ selector: '#bnp_btn_accept', label: 'Bing Cookie Accept' },
{ selector: '#bnp_close_link', label: 'Bing Cookie Close' },
{ selector: '#reward_pivot_earn', label: 'Rewards Pivot Earn' },
{ selector: '//div[@id="cookieConsentContainer"]//button[contains(text(), "Accept")]', label: 'Legacy Cookie Accept', isXPath: true }
]
for (const button of buttons) {
for (let round = 0; round < attempts; round++) {
let dismissedThisRound = 0
for (const btn of buttonGroups) {
try {
const loc = btn.isXPath ? page.locator(`xpath=${btn.selector}`) : page.locator(btn.selector)
if (await loc.first().isVisible({ timeout: 200 }).catch(()=>false)) {
await loc.first().click({ timeout: 500 }).catch(()=>{})
dismissedThisRound++
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', `Dismissed: ${btn.label}`)
await page.waitForTimeout(150)
}
} catch { /* ignore */ }
}
// Special case: blocking overlay with inside buttons
try {
const element = button.isXPath ? page.locator(`xpath=${button.selector}`) : page.locator(button.selector)
await element.first().click({ timeout: 500 })
await page.waitForTimeout(500)
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', `Dismissed: ${button.label}`)
} catch (error) {
// Silent fail
}
}
// Handle blocking Bing privacy overlay intercepting clicks (#bnp_overlay_wrapper)
try {
const overlay = await page.locator('#bnp_overlay_wrapper').first()
if (await overlay.isVisible({ timeout: 500 }).catch(()=>false)) {
// Try common dismiss buttons inside overlay
const rejectBtn = await page.locator('#bnp_btn_reject, button[aria-label*="Reject" i]').first()
const acceptBtn = await page.locator('#bnp_btn_accept').first()
if (await rejectBtn.isVisible().catch(()=>false)) {
await rejectBtn.click({ timeout: 500 }).catch(()=>{})
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', 'Dismissed: Bing Overlay Reject')
} else if (await acceptBtn.isVisible().catch(()=>false)) {
await acceptBtn.click({ timeout: 500 }).catch(()=>{})
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', 'Dismissed: Bing Overlay Accept (fallback)')
const overlay = page.locator('#bnp_overlay_wrapper')
if (await overlay.isVisible({ timeout: 200 }).catch(()=>false)) {
const reject = overlay.locator('#bnp_btn_reject, button[aria-label*="Reject" i]')
const accept = overlay.locator('#bnp_btn_accept')
if (await reject.first().isVisible().catch(()=>false)) {
await reject.first().click({ timeout: 500 }).catch(()=>{})
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', 'Dismissed: Overlay Reject')
dismissedThisRound++
} else if (await accept.first().isVisible().catch(()=>false)) {
await accept.first().click({ timeout: 500 }).catch(()=>{})
this.bot.log(this.bot.isMobile, 'DISMISS-ALL-MESSAGES', 'Dismissed: Overlay Accept')
dismissedThisRound++
}
}
await page.waitForTimeout(300)
}
} catch { /* ignore */ }
} catch { /* ignore */ }
if (dismissedThisRound === 0) break // nothing new dismissed -> stop early
}
}
async getLatestTab(page: Page): Promise<Page> {
@@ -78,40 +83,6 @@ export default class BrowserUtil {
}
}
async getTabs(page: Page) {
try {
const browser = page.context()
const pages = browser.pages()
const homeTab = pages[1]
let homeTabURL: URL
if (!homeTab) {
throw this.bot.log(this.bot.isMobile, 'GET-TABS', 'Home tab could not be found!', 'error')
} else {
homeTabURL = new URL(homeTab.url())
if (homeTabURL.hostname !== 'rewards.bing.com') {
throw this.bot.log(this.bot.isMobile, 'GET-TABS', 'Reward page hostname is invalid: ' + homeTabURL.host, 'error')
}
}
const workerTab = pages[2]
if (!workerTab) {
throw this.bot.log(this.bot.isMobile, 'GET-TABS', 'Worker tab could not be found!', 'error')
}
return {
homeTab: homeTab,
workerTab: workerTab
}
} catch (error) {
throw this.bot.log(this.bot.isMobile, 'GET-TABS', 'An error occurred:' + error, 'error')
}
}
async reloadBadPage(page: Page): Promise<void> {
try {
const html = await page.content().catch(() => '')
@@ -129,4 +100,80 @@ export default class BrowserUtil {
}
}
/**
* Perform small human-like gestures: short waits, minor mouse moves and occasional scrolls.
* This should be called sparingly between actions to avoid a fixed cadence.
*/
async humanizePage(page: Page): Promise<void> {
try {
const h = this.bot.config?.humanization || {}
if (h.enabled === false) return
const moveProb = typeof h.gestureMoveProb === 'number' ? h.gestureMoveProb : 0.4
const scrollProb = typeof h.gestureScrollProb === 'number' ? h.gestureScrollProb : 0.2
// minor mouse move
if (Math.random() < moveProb) {
const x = Math.floor(Math.random() * 30) + 5
const y = Math.floor(Math.random() * 20) + 3
await page.mouse.move(x, y, { steps: 2 }).catch(() => { })
}
// tiny scroll
if (Math.random() < scrollProb) {
const dy = (Math.random() < 0.5 ? 1 : -1) * (Math.floor(Math.random() * 150) + 50)
await page.mouse.wheel(0, dy).catch(() => { })
}
// Random short wait; override via humanization.actionDelay
const range = h.actionDelay
if (range && typeof range.min !== 'undefined' && typeof range.max !== 'undefined') {
try {
const ms = (await import('ms')).default
const min = typeof range.min === 'number' ? range.min : ms(String(range.min))
const max = typeof range.max === 'number' ? range.max : ms(String(range.max))
if (typeof min === 'number' && typeof max === 'number' && max >= min) {
await this.bot.utils.wait(this.bot.utils.randomNumber(Math.max(0, min), Math.min(max, 5000)))
} else {
await this.bot.utils.wait(this.bot.utils.randomNumber(150, 450))
}
} catch {
await this.bot.utils.wait(this.bot.utils.randomNumber(150, 450))
}
} else {
await this.bot.utils.wait(this.bot.utils.randomNumber(150, 450))
}
} catch { /* swallow */ }
}
/**
* Capture minimal diagnostics for a page: screenshot + HTML content.
* Files are written under ./reports/<date>/ with a safe label.
*/
async captureDiagnostics(page: Page, label: string): Promise<void> {
try {
const cfg = this.bot.config?.diagnostics || {}
if (cfg.enabled === false) return
const maxPerRun = typeof cfg.maxPerRun === 'number' ? cfg.maxPerRun : 8
if (!this.bot.tryReserveDiagSlot(maxPerRun)) return
const safe = label.replace(/[^a-z0-9-_]/gi, '_').slice(0, 64)
const now = new Date()
const day = `${now.getFullYear()}-${String(now.getMonth()+1).padStart(2,'0')}-${String(now.getDate()).padStart(2,'0')}`
const baseDir = `${process.cwd()}/reports/${day}`
const fs = await import('fs')
const path = await import('path')
if (!fs.existsSync(baseDir)) fs.mkdirSync(baseDir, { recursive: true })
const ts = `${String(now.getHours()).padStart(2,'0')}${String(now.getMinutes()).padStart(2,'0')}${String(now.getSeconds()).padStart(2,'0')}`
const shot = path.join(baseDir, `${ts}_${safe}.png`)
const htmlPath = path.join(baseDir, `${ts}_${safe}.html`)
if (cfg.saveScreenshot !== false) {
await page.screenshot({ path: shot }).catch(()=>{})
}
if (cfg.saveHtml !== false) {
const html = await page.content().catch(()=> '<html></html>')
fs.writeFileSync(htmlPath, html)
}
this.bot.log(this.bot.isMobile, 'DIAG', `Saved diagnostics to ${shot} and ${htmlPath}`)
} catch (e) {
this.bot.log(this.bot.isMobile, 'DIAG', `Failed to capture diagnostics: ${e instanceof Error ? e.message : e}`, 'warn')
}
}
}

View File

@@ -1,51 +1,197 @@
{
// Base URL for Rewards dashboard and APIs (do not change unless you know what you're doing)
"baseURL": "https://rewards.bing.com",
// Where to store sessions (cookies, fingerprints)
"sessionPath": "sessions",
"headless": false,
"parallel": false,
"runOnZeroPoints": false,
"clusters": 1,
"saveFingerprint": {
"mobile": false,
"desktop": false
"browser": {
// Run browser without UI (true=headless, false=visible). Visible can help with stability.
"headless": false,
// Max time to wait for common operations (supports ms/s/min: e.g. 30000, "30s", "2min")
"globalTimeout": "30s"
},
"execution": {
// Run desktop+mobile in parallel (needs more resources). If false, runs sequentially.
"parallel": false,
// If false and there are 0 points available, the run is skipped early to save time.
"runOnZeroPoints": false,
// Number of account clusters (processes) to run concurrently.
"clusters": 1,
// Number of passes per invocation (advanced; usually 1).
"passesPerRun": 1
},
"buyMode": {
// Manual purchase/redeem mode. Use CLI -buy to enable, or set buyMode.enabled in config.
// Session duration cap in minutes.
"maxMinutes": 45
},
"fingerprinting": {
// Persist browser fingerprints per device type to improve consistency across runs
"saveFingerprint": {
"mobile": false,
"desktop": false
}
},
"search": {
// Use locale-specific query sources
"useLocalQueries": false,
"settings": {
// Add geo/locale signal into query selection
"useGeoLocaleQueries": false,
// Randomly scroll search result pages to look more natural
"scrollRandomResults": true,
// Occasionally click a result (safe targets only)
"clickRandomResults": true,
// Number of times to retry mobile searches if points didn't progress
"retryMobileSearchAmount": 2,
// Delay between searches (supports numbers in ms or time strings)
"delay": {
"min": "3min",
"max": "5min"
}
}
},
"humanization": {
// Global Human Mode switch. true=adds subtle micro-gestures & pauses. false=classic behavior.
"enabled": true,
// If true, as soon as a ban is detected on any account, stop processing remaining accounts
// (ban detection is based on centralized heuristics and error signals)
"stopOnBan": true,
// If true, immediately send an alert (webhook/NTFY) when a ban is detected
"immediateBanAlert": true,
// Extra random pause between actions (ms or time string e.g., "300ms", "1s")
"actionDelay": {
"min": 150,
"max": 450
},
// Probability (0..1) to move mouse a tiny bit in between actions
"gestureMoveProb": 0.4,
// Probability (0..1) to perform a very small scroll
"gestureScrollProb": 0.2,
// Optional local-time windows for execution (e.g., ["08:30-11:00", "19:00-22:00"]).
// If provided, runs will wait until inside a window before starting.
"allowedWindows": []
},
// Optional monthly "vacation" block: skip a contiguous range of days to look more human.
// This is independent of weekly random off-days. When enabled, each month a random
// block between minDays and maxDays is selected (e.g., 3–5 days) and all runs within
// that date range are skipped. The chosen block is logged at the start of the month.
"vacation": {
"enabled": false,
"minDays": 3,
"maxDays": 5
},
"retryPolicy": {
// Generic retry/backoff for transient failures
"maxAttempts": 3,
"baseDelay": 1000,
"maxDelay": "30s",
"multiplier": 2,
"jitter": 0.2
},
"workers": {
// Select what the bot should complete on desktop/mobile
"doDailySet": true,
"doMorePromotions": true,
"doPunchCards": true,
"doDesktopSearch": true,
"doMobileSearch": true,
"doDailyCheckIn": true,
"doReadToEarn": true
"doReadToEarn": true,
// If true, run a desktop search bundle right after Daily Set
"bundleDailySetWithSearch": false
},
"searchOnBingLocalQueries": false,
"globalTimeout": "30s",
"searchSettings": {
"useGeoLocaleQueries": false,
"scrollRandomResults": true,
"clickRandomResults": true,
"searchDelay": {
"min": "3min",
"max": "5min"
},
"retryMobileSearchAmount": 2
},
"logExcludeFunc": [
"SEARCH-CLOSE-TABS"
],
"webhookLogExcludeFunc": [
"SEARCH-CLOSE-TABS"
],
"proxy": {
// Control which outbound calls go through your proxy
"proxyGoogleTrends": true,
"proxyBingTerms": true
},
"webhook": {
"enabled": false,
"url": ""
"notifications": {
// Live logs (Discord or similar). URL is your webhook endpoint.
"webhook": {
"enabled": false,
"url": ""
},
// Rich end-of-run summary (Discord or similar)
"conclusionWebhook": {
"enabled": false,
"url": ""
},
// NTFY push notifications (plain text)
"ntfy": {
"enabled": false,
"url": "",
"topic": "rewards",
"authToken": ""
}
},
"conclusionWebhook": {
"logging": {
// Logging controls (see docs/config.md). Remove redactEmails or set false to show full emails.
// Filter out noisy log buckets locally and for any webhook summaries
"excludeFunc": [
"SEARCH-CLOSE-TABS",
"LOGIN-NO-PROMPT",
"FLOW"
],
"webhookExcludeFunc": [
"SEARCH-CLOSE-TABS",
"LOGIN-NO-PROMPT",
"FLOW"
],
// Email redaction toggle (previously logging.live.redactEmails)
"redactEmails": true
},
"diagnostics": {
// Capture minimal evidence on failures (screenshots/HTML) and prune old runs
"enabled": true,
"saveScreenshot": true,
"saveHtml": true,
"maxPerRun": 2,
"retentionDays": 7
},
"jobState": {
// Checkpoint to avoid duplicate work across restarts
"enabled": true,
// Custom state directory (defaults to sessionPath/job-state if empty)
"dir": ""
},
"schedule": {
// Built-in scheduler (no cron needed in container). Uses the IANA time zone below.
"enabled": false,
"url": ""
// Choose YOUR preferred time format:
// - US style with AM/PM → set useAmPm: true and edit time12 (e.g., "9:00 AM")
// - 24-hour style → set useAmPm: false and edit time24 (e.g., "09:00")
// Back-compat: if both time12/time24 are empty, the legacy "time" (HH:mm) will be used if present.
"useAmPm": false,
"time12": "9:00 AM",
"time24": "09:00",
// IANA timezone for scheduling (set to your region), e.g. "Europe/Paris" or "America/New_York"
"timeZone": "America/New_York",
// If true, run one pass immediately when the process starts
"runImmediatelyOnStart": false
},
"update": {
// Optional post-run auto-update
"git": true,
"docker": false,
// Custom updater script path (relative to repo root)
"scriptPath": "setup/update/update.mjs"
}
}

View File

@@ -1,2 +0,0 @@
# Run automation according to CRON_SCHEDULE; redirect both stdout & stderr to Docker logs
${CRON_SCHEDULE} TZ=${TZ} /bin/bash /usr/src/microsoft-rewards-script/src/run_daily.sh >> /proc/1/fd/1 2>&1

View File

@@ -13,15 +13,109 @@ import { ReadToEarn } from './activities/ReadToEarn'
import { DailyCheckIn } from './activities/DailyCheckIn'
import { DashboardData, MorePromotion, PromotionalItem } from '../interface/DashboardData'
import type { ActivityHandler } from '../interface/ActivityHandler'
type ActivityKind =
| { type: 'poll' }
| { type: 'abc' }
| { type: 'thisOrThat' }
| { type: 'quiz' }
| { type: 'urlReward' }
| { type: 'searchOnBing' }
| { type: 'unsupported' }
export default class Activities {
private bot: MicrosoftRewardsBot
private handlers: ActivityHandler[] = []
constructor(bot: MicrosoftRewardsBot) {
this.bot = bot
}
// Register external/custom handlers (optional extension point)
registerHandler(handler: ActivityHandler) {
this.handlers.push(handler)
}
// Centralized dispatcher for activities from dashboard/punchcards
async run(page: Page, activity: MorePromotion | PromotionalItem): Promise<void> {
// First, try custom handlers (if any)
for (const h of this.handlers) {
try {
if (h.canHandle(activity)) {
await h.run(page, activity)
return
}
} catch (e) {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Custom handler ${(h.id || 'unknown')} failed: ${e instanceof Error ? e.message : e}`, 'error')
}
}
const kind = this.classifyActivity(activity)
try {
switch (kind.type) {
case 'poll':
await this.doPoll(page)
break
case 'abc':
await this.doABC(page)
break
case 'thisOrThat':
await this.doThisOrThat(page)
break
case 'quiz':
await this.doQuiz(page)
break
case 'searchOnBing':
await this.doSearchOnBing(page, activity)
break
case 'urlReward':
await this.doUrlReward(page)
break
default:
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Skipped activity "${activity.title}" | Reason: Unsupported type: "${String((activity as { promotionType?: string }).promotionType)}"!`, 'warn')
break
}
} catch (e) {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Dispatcher error for "${activity.title}": ${e instanceof Error ? e.message : e}`, 'error')
}
}
public getTypeLabel(activity: MorePromotion | PromotionalItem): string {
const k = this.classifyActivity(activity)
switch (k.type) {
case 'poll': return 'Poll'
case 'abc': return 'ABC'
case 'thisOrThat': return 'ThisOrThat'
case 'quiz': return 'Quiz'
case 'searchOnBing': return 'SearchOnBing'
case 'urlReward': return 'UrlReward'
default: return 'Unsupported'
}
}
private classifyActivity(activity: MorePromotion | PromotionalItem): ActivityKind {
const type = (activity.promotionType || '').toLowerCase()
if (type === 'quiz') {
// Distinguish Poll/ABC/ThisOrThat vs general quiz using current heuristics
const max = activity.pointProgressMax
const url = (activity.destinationUrl || '').toLowerCase()
if (max === 10) {
if (url.includes('pollscenarioid')) return { type: 'poll' }
return { type: 'abc' }
}
if (max === 50) return { type: 'thisOrThat' }
return { type: 'quiz' }
}
if (type === 'urlreward') {
const name = (activity.name || '').toLowerCase()
if (name.includes('exploreonbing')) return { type: 'searchOnBing' }
return { type: 'urlReward' }
}
return { type: 'unsupported' }
}
doSearch = async (page: Page, data: DashboardData): Promise<void> => {
const search = new Search(this.bot)
await search.doSearch(page, data)

File diff suppressed because it is too large

View File

@@ -3,19 +3,30 @@ import { Page } from 'rebrowser-playwright'
import { DashboardData, MorePromotion, PromotionalItem, PunchCard } from '../interface/DashboardData'
import { MicrosoftRewardsBot } from '../index'
import JobState from '../util/JobState'
import Retry from '../util/Retry'
import { AdaptiveThrottler } from '../util/AdaptiveThrottler'
export class Workers {
public bot: MicrosoftRewardsBot
private jobState: JobState
constructor(bot: MicrosoftRewardsBot) {
this.bot = bot
this.jobState = new JobState(this.bot.config)
}
// Daily Set
async doDailySet(page: Page, data: DashboardData) {
const todayData = data.dailySetPromotions[this.bot.utils.getFormattedDate()]
const activitiesUncompleted = todayData?.filter(x => !x.complete && x.pointProgressMax > 0) ?? []
const today = this.bot.utils.getFormattedDate()
const activitiesUncompleted = (todayData?.filter(x => !x.complete && x.pointProgressMax > 0) ?? [])
.filter(x => {
if (this.bot.config.jobState?.enabled === false) return true
const email = this.bot.currentAccountEmail || 'unknown'
return !this.jobState.isDone(email, today, x.offerId)
})
if (!activitiesUncompleted.length) {
this.bot.log(this.bot.isMobile, 'DAILY-SET', 'All Daily Set" items have already been completed')
@@ -27,12 +38,30 @@ export class Workers {
await this.solveActivities(page, activitiesUncompleted)
// Mark as done to prevent duplicate work if checkpoints enabled
if (this.bot.config.jobState?.enabled !== false) {
const email = this.bot.currentAccountEmail || 'unknown'
for (const a of activitiesUncompleted) {
this.jobState.markDone(email, today, a.offerId)
}
}
page = await this.bot.browser.utils.getLatestTab(page)
// Always return to the homepage if not already
await this.bot.browser.func.goHome(page)
this.bot.log(this.bot.isMobile, 'DAILY-SET', 'All "Daily Set" items have been completed')
// Optional: immediately run desktop search bundle
if (!this.bot.isMobile && this.bot.config.workers.bundleDailySetWithSearch && this.bot.config.workers.doDesktopSearch) {
try {
await this.bot.utils.waitRandom(1200, 2600)
await this.bot.activities.doSearch(page, data)
} catch (e) {
this.bot.log(this.bot.isMobile, 'DAILY-SET', `Post-DailySet search failed: ${e instanceof Error ? e.message : e}`, 'warn')
}
}
}
// Punch Card
@@ -120,7 +149,9 @@ export class Workers {
private async solveActivities(activityPage: Page, activities: PromotionalItem[] | MorePromotion[], punchCard?: PunchCard) {
const activityInitial = activityPage.url() // Homepage for Daily/More and Index for promotions
for (const activity of activities) {
const retry = new Retry(this.bot.config.retryPolicy)
const throttle = new AdaptiveThrottler()
for (const activity of activities) {
try {
// Reselect the worker page
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
@@ -132,7 +163,11 @@ export class Workers {
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
}
await this.bot.utils.wait(1000)
await this.bot.browser.utils.humanizePage(activityPage)
{
const m = throttle.getDelayMultiplier()
await this.bot.utils.waitRandom(Math.floor(800*m), Math.floor(1400*m))
}
if (activityPage.url() !== activityInitial) {
await activityPage.goto(activityInitial)
@@ -154,74 +189,50 @@ export class Workers {
if it didn't then it gave enough time for the page to load.
*/
await activityPage.waitForLoadState('networkidle', { timeout: 10000 }).catch(() => { })
await this.bot.utils.wait(2000)
switch (activity.promotionType) {
// Quiz (Poll, Quiz or ABC)
case 'quiz':
switch (activity.pointProgressMax) {
// Poll or ABC (Usually 10 points)
case 10:
// Normal poll
if (activity.destinationUrl.toLowerCase().includes('pollscenarioid')) {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "Poll" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doPoll(activityPage)
} else { // ABC
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "ABC" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doABC(activityPage)
}
break
// This Or That Quiz (Usually 50 points)
case 50:
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "ThisOrThat" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doThisOrThat(activityPage)
break
// Quizzes are usually 30-40 points
default:
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "Quiz" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doQuiz(activityPage)
break
}
break
// UrlReward (Visit)
case 'urlreward':
// Search on Bing are subtypes of "urlreward"
if (activity.name.toLowerCase().includes('exploreonbing')) {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "SearchOnBing" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doSearchOnBing(activityPage, activity)
} else {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "UrlReward" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
await this.bot.activities.doUrlReward(activityPage)
}
break
// Unsupported types
default:
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Skipped activity "${activity.title}" | Reason: Unsupported type: "${activity.promotionType}"!`, 'warn')
break
// Small human-like jitter before executing
await this.bot.browser.utils.humanizePage(activityPage)
{
const m = throttle.getDelayMultiplier()
await this.bot.utils.waitRandom(Math.floor(1200*m), Math.floor(2600*m))
}
// Cooldown
await this.bot.utils.wait(2000)
// Log the detected type using the same heuristics as before
const typeLabel = this.bot.activities.getTypeLabel(activity)
if (typeLabel !== 'Unsupported') {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Found activity type: "${typeLabel}" title: "${activity.title}"`)
await activityPage.click(selector)
activityPage = await this.bot.browser.utils.getLatestTab(activityPage)
// Watchdog: abort if the activity hangs too long
const timeoutMs = this.bot.utils.stringToMs(this.bot.config?.globalTimeout ?? '30s') * 2
const runWithTimeout = (p: Promise<void>) => Promise.race([
p,
new Promise<void>((_, rej) => setTimeout(() => rej(new Error('activity-timeout')), timeoutMs))
])
await retry.run(async () => {
try {
await runWithTimeout(this.bot.activities.run(activityPage, activity))
throttle.record(true)
} catch (e) {
await this.bot.browser.utils.captureDiagnostics(activityPage, `activity_timeout_${activity.title || activity.offerId}`)
throttle.record(false)
throw e
}
}, () => true)
} else {
this.bot.log(this.bot.isMobile, 'ACTIVITY', `Skipped activity "${activity.title}" | Reason: Unsupported type: "${activity.promotionType}"!`, 'warn')
}
// Cooldown with jitter
await this.bot.browser.utils.humanizePage(activityPage)
{
const m = throttle.getDelayMultiplier()
await this.bot.utils.waitRandom(Math.floor(1200*m), Math.floor(2600*m))
}
} catch (error) {
await this.bot.browser.utils.captureDiagnostics(activityPage, `activity_error_${activity.title || activity.offerId}`)
this.bot.log(this.bot.isMobile, 'ACTIVITY', 'An error occurred:' + error, 'error')
throttle.record(false)
}
}

View File

@@ -30,7 +30,7 @@ export class Quiz extends Workers {
for (let i = 0; i < quizData.numberOfOptions; i++) {
const answerSelector = await page.waitForSelector(`#rqAnswerOption${i}`, { state: 'visible', timeout: 10000 })
const answerAttribute = await answerSelector?.evaluate((el: any) => el.getAttribute('iscorrectoption'))
const answerAttribute = await answerSelector?.evaluate((el: Element) => el.getAttribute('iscorrectoption'))
if (answerAttribute && answerAttribute.toLowerCase() === 'true') {
answers.push(`#rqAnswerOption${i}`)
@@ -60,7 +60,7 @@ export class Quiz extends Workers {
for (let i = 0; i < quizData.numberOfOptions; i++) {
const answerSelector = await page.waitForSelector(`#rqAnswerOption${i}`, { state: 'visible', timeout: 10000 })
const dataOption = await answerSelector?.evaluate((el: any) => el.getAttribute('data-option'))
const dataOption = await answerSelector?.evaluate((el: Element) => el.getAttribute('data-option'))
if (dataOption === correctOption) {
// Click the answer on page
@@ -84,6 +84,7 @@ export class Quiz extends Workers {
this.bot.log(this.bot.isMobile, 'QUIZ', 'Completed the quiz successfully')
} catch (error) {
await this.bot.browser.utils.captureDiagnostics(page, 'quiz_error')
await page.close()
this.bot.log(this.bot.isMobile, 'QUIZ', 'An error occurred:' + error, 'error')
}

View File

@@ -33,12 +33,32 @@ export class Search extends Workers {
return
}
// Generate search queries
let googleSearchQueries = await this.getGoogleTrends(this.bot.config.searchSettings.useGeoLocaleQueries ? data.userProfile.attributes.country : 'US')
googleSearchQueries = this.bot.utils.shuffleArray(googleSearchQueries)
// Generate search queries (primary: Google Trends)
const geo = this.bot.config.searchSettings.useGeoLocaleQueries ? data.userProfile.attributes.country : 'US'
let googleSearchQueries = await this.getGoogleTrends(geo)
// Deduplicate the search terms
googleSearchQueries = [...new Set(googleSearchQueries)]
// Fallback: if trends failed or insufficient, sample from local queries file
if (!googleSearchQueries.length || googleSearchQueries.length < 10) {
this.bot.log(this.bot.isMobile, 'SEARCH-BING', 'Primary trends source insufficient, falling back to local queries.json', 'warn')
try {
const local = await import('../queries.json')
// Flatten & sample
const sampleSize = Math.max(5, Math.min(this.bot.config.searchSettings.localFallbackCount || 25, local.default.length))
const sampled = this.bot.utils.shuffleArray(local.default).slice(0, sampleSize)
googleSearchQueries = sampled.map((x: { title: string; queries: string[] }) => ({ topic: x.queries[0] || x.title, related: x.queries.slice(1) }))
} catch (e) {
this.bot.log(this.bot.isMobile, 'SEARCH-BING', 'Failed loading local queries fallback: ' + (e instanceof Error ? e.message : e), 'error')
}
}
googleSearchQueries = this.bot.utils.shuffleArray(googleSearchQueries)
// Deduplicate topics
const seen = new Set<string>()
googleSearchQueries = googleSearchQueries.filter(q => {
if (seen.has(q.topic.toLowerCase())) return false
seen.add(q.topic.toLowerCase())
return true
})
// Go to bing
await page.goto(this.searchPageURL ? this.searchPageURL : this.bingHome)
@@ -47,7 +67,7 @@ export class Search extends Workers {
await this.bot.browser.utils.tryDismissAllMessages(page)
let maxLoop = 0 // If the loop hits 10 this when not gaining any points, we're assuming it's stuck. If it doesn't continue after 5 more searches with alternative queries, abort search
let stagnation = 0 // consecutive searches without point progress
const queries: string[] = []
// Mobile search doesn't seem to like related queries?
@@ -63,28 +83,26 @@ export class Search extends Workers {
const newMissingPoints = this.calculatePoints(searchCounters)
// If the new point amount is the same as before
if (newMissingPoints == missingPoints) {
maxLoop++ // Add to max loop
} else { // There has been a change in points
maxLoop = 0 // Reset the loop
if (newMissingPoints === missingPoints) {
stagnation++
} else {
stagnation = 0
}
missingPoints = newMissingPoints
if (missingPoints === 0) {
break
}
if (missingPoints === 0) break
// Only for mobile searches
if (maxLoop > 5 && this.bot.isMobile) {
if (stagnation > 5 && this.bot.isMobile) {
this.bot.log(this.bot.isMobile, 'SEARCH-BING', 'Search didn\'t gain point for 5 iterations, likely bad User-Agent', 'warn')
break
}
// If we didn't gain points for 10 iterations, assume it's stuck
if (maxLoop > 10) {
if (stagnation > 10) {
this.bot.log(this.bot.isMobile, 'SEARCH-BING', 'Search didn\'t gain point for 10 iterations aborting searches', 'warn')
maxLoop = 0 // Reset to 0 so we can retry with related searches below
stagnation = 0 // allow fallback loop below
break
}
}
@@ -99,8 +117,11 @@ export class Search extends Workers {
this.bot.log(this.bot.isMobile, 'SEARCH-BING', `Search completed but we're missing ${missingPoints} points, generating extra searches`)
let i = 0
while (missingPoints > 0) {
let fallbackRounds = 0
const extraRetries = this.bot.config.searchSettings.extraFallbackRetries || 1
while (missingPoints > 0 && fallbackRounds <= extraRetries) {
const query = googleSearchQueries[i++] as GoogleSearch
if (!query) break
// Get related search terms to the Google search queries
const relatedTerms = await this.getRelatedTerms(query?.topic)
@@ -113,10 +134,10 @@ export class Search extends Workers {
const newMissingPoints = this.calculatePoints(searchCounters)
// If the new point amount is the same as before
if (newMissingPoints == missingPoints) {
maxLoop++ // Add to max loop
} else { // There has been a change in points
maxLoop = 0 // Reset the loop
if (newMissingPoints === missingPoints) {
stagnation++
} else {
stagnation = 0
}
missingPoints = newMissingPoints
@@ -127,11 +148,12 @@ export class Search extends Workers {
}
// Try 5 more times, then we tried a total of 15 times, fair to say it's stuck
if (maxLoop > 5) {
if (stagnation > 5) {
this.bot.log(this.bot.isMobile, 'SEARCH-BING-EXTRA', 'Search didn\'t gain point for 5 iterations aborting searches', 'warn')
return
}
}
fallbackRounds++
}
}
}
@@ -156,20 +178,38 @@ export class Search extends Workers {
await this.bot.utils.wait(500)
const searchBar = '#sb_form_q'
await searchPage.waitForSelector(searchBar, { state: 'visible', timeout: 10000 })
await searchPage.click(searchBar) // Focus on the textarea
await this.bot.utils.wait(500)
await searchPage.keyboard.down(platformControlKey)
await searchPage.keyboard.press('A')
await searchPage.keyboard.press('Backspace')
await searchPage.keyboard.up(platformControlKey)
await searchPage.keyboard.type(query)
await searchPage.keyboard.press('Enter')
// Prefer attached over visible to avoid strict visibility waits when overlays exist
const box = searchPage.locator(searchBar)
await box.waitFor({ state: 'attached', timeout: 15000 })
// Try dismissing overlays before interacting
await this.bot.browser.utils.tryDismissAllMessages(searchPage)
await this.bot.utils.wait(200)
let navigatedDirectly = false
try {
// Try focusing and filling instead of clicking (more reliable on mobile)
await box.focus({ timeout: 2000 }).catch(() => { /* ignore focus errors */ })
await box.fill('')
await this.bot.utils.wait(200)
await searchPage.keyboard.down(platformControlKey)
await searchPage.keyboard.press('A')
await searchPage.keyboard.press('Backspace')
await searchPage.keyboard.up(platformControlKey)
await box.type(query, { delay: 20 })
await searchPage.keyboard.press('Enter')
} catch (typeErr) {
// As a robust fallback, navigate directly to the search results URL
const q = encodeURIComponent(query)
const url = `https://www.bing.com/search?q=${q}`
await searchPage.goto(url)
navigatedDirectly = true
}
await this.bot.utils.wait(3000)
// Bing.com in Chrome opens a new tab when searching
const resultPage = await this.bot.browser.utils.getLatestTab(searchPage)
// Bing.com in Chrome opens a new tab when searching via Enter; if we navigated directly, stay on current tab
const resultPage = navigatedDirectly ? searchPage : await this.bot.browser.utils.getLatestTab(searchPage)
this.searchPageURL = new URL(resultPage.url()).href // Set the results page
await this.bot.browser.utils.reloadBadPage(resultPage)
@@ -185,7 +225,10 @@ export class Search extends Workers {
}
// Delay between searches
await this.bot.utils.wait(Math.floor(this.bot.utils.randomNumber(this.bot.utils.stringToMs(this.bot.config.searchSettings.searchDelay.min), this.bot.utils.stringToMs(this.bot.config.searchSettings.searchDelay.max))))
const minDelay = this.bot.utils.stringToMs(this.bot.config.searchSettings.searchDelay.min)
const maxDelay = this.bot.utils.stringToMs(this.bot.config.searchSettings.searchDelay.max)
const adaptivePad = Math.min(4000, Math.max(0, Math.floor(Math.random() * 800)))
await this.bot.utils.wait(Math.floor(this.bot.utils.randomNumber(minDelay, maxDelay)) + adaptivePad)
return await this.bot.browser.func.getSearchPoints()

View File

@@ -20,11 +20,20 @@ export class SearchOnBing extends Workers {
const query = await this.getSearchQuery(activity.title)
const searchBar = '#sb_form_q'
await page.waitForSelector(searchBar, { state: 'visible', timeout: 10000 })
await this.safeClick(page, searchBar)
await this.bot.utils.wait(500)
await page.keyboard.type(query)
await page.keyboard.press('Enter')
const box = page.locator(searchBar)
await box.waitFor({ state: 'attached', timeout: 15000 })
await this.bot.browser.utils.tryDismissAllMessages(page)
await this.bot.utils.wait(200)
try {
await box.focus({ timeout: 2000 }).catch(() => { /* ignore */ })
await box.fill('')
await this.bot.utils.wait(200)
await page.keyboard.type(query, { delay: 20 })
await page.keyboard.press('Enter')
} catch {
const url = `https://www.bing.com/search?q=${encodeURIComponent(query)}`
await page.goto(url)
}
await this.bot.utils.wait(3000)
await page.close()
@@ -36,22 +45,6 @@ export class SearchOnBing extends Workers {
}
}
private async safeClick(page: Page, selector: string) {
try {
await page.click(selector, { timeout: 5000 })
} catch (e: any) {
const msg = (e?.message || '')
if (/Timeout.*click/i.test(msg) || /intercepts pointer events/i.test(msg)) {
// Try to dismiss overlays then retry once
await this.bot.browser.utils.tryDismissAllMessages(page)
await this.bot.utils.wait(500)
await page.click(selector, { timeout: 5000 })
} else {
throw e
}
}
}
private async getSearchQuery(title: string): Promise<string> {
interface Queries {
title: string;

File diff suppressed because it is too large

View File

@@ -1,6 +1,10 @@
export interface Account {
email: string;
password: string;
/** Optional TOTP secret in Base32 (e.g., from Microsoft Authenticator setup) */
totp?: string;
/** Optional recovery email used to verify masked address on Microsoft login screens */
recoveryEmail?: string;
proxy: AccountProxy;
}

View File

@@ -0,0 +1,21 @@
import type { MorePromotion, PromotionalItem } from './DashboardData'
import type { Page } from 'playwright'
/**
* Activity handler contract for solving a single dashboard activity.
* Implementations should be stateless (or hold only a reference to the bot)
* and perform all required steps on the provided page.
*/
export interface ActivityHandler {
/** Optional identifier for diagnostics */
id?: string
/**
* Return true if this handler knows how to process the given activity.
*/
canHandle(activity: MorePromotion | PromotionalItem): boolean
/**
* Execute the activity on the provided page. The page is already
* navigated to the activity tab/window by the caller.
*/
run(page: Page, activity: MorePromotion | PromotionalItem): Promise<void>
}
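A hedged sketch of what an implementation of this contract could look like; the class name, import paths, and the promotionType check are assumptions for illustration only:

import type { Page } from 'playwright'
import type { MorePromotion, PromotionalItem } from './DashboardData'
import type { ActivityHandler } from './ActivityHandler'

// Hypothetical handler: assumes the activity object exposes a promotionType field.
class QuizHandler implements ActivityHandler {
    id = 'quiz'
    canHandle(activity: MorePromotion | PromotionalItem): boolean {
        // Decide purely from activity metadata; keep no per-run state
        return String((activity as { promotionType?: string }).promotionType || '').includes('quiz')
    }
    async run(page: Page, activity: MorePromotion | PromotionalItem): Promise<void> {
        // The caller has already navigated to the activity tab; work only on `page`
        await page.waitForLoadState('domcontentloaded')
        // ...solve the quiz for `activity` here...
    }
}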


@@ -10,11 +10,23 @@ export interface Config {
searchOnBingLocalQueries: boolean;
globalTimeout: number | string;
searchSettings: ConfigSearchSettings;
humanization?: ConfigHumanization; // Anti-ban humanization controls
retryPolicy?: ConfigRetryPolicy; // Global retry/backoff policy
jobState?: ConfigJobState; // Persistence of per-activity checkpoints
logExcludeFunc: string[];
webhookLogExcludeFunc: string[];
logging?: ConfigLogging; // Preserve original logging object (for live webhook settings)
proxy: ConfigProxy;
webhook: ConfigWebhook;
conclusionWebhook?: ConfigWebhook; // Optional secondary webhook for final summary
ntfy: ConfigNtfy;
diagnostics?: ConfigDiagnostics;
update?: ConfigUpdate;
schedule?: ConfigSchedule;
passesPerRun?: number;
buyMode?: ConfigBuyMode; // Optional manual spending mode
vacation?: ConfigVacation; // Optional monthly contiguous off-days
crashRecovery?: ConfigCrashRecovery; // Automatic restart / graceful shutdown
}
export interface ConfigSaveFingerprint {
@@ -28,6 +40,8 @@ export interface ConfigSearchSettings {
clickRandomResults: boolean;
searchDelay: ConfigSearchDelay;
retryMobileSearchAmount: number;
localFallbackCount?: number; // Number of local fallback queries to sample when trends fail
extraFallbackRetries?: number; // Additional mini-retry loops with fallback terms
}
export interface ConfigSearchDelay {
@@ -38,6 +52,15 @@ export interface ConfigSearchDelay {
export interface ConfigWebhook {
enabled: boolean;
url: string;
username?: string; // Optional override for displayed webhook name
avatarUrl?: string; // Optional avatar image URL
}
export interface ConfigNtfy {
enabled: boolean;
url: string;
topic: string;
authToken?: string; // Optional authentication token
}
export interface ConfigProxy {
@@ -45,6 +68,50 @@ export interface ConfigProxy {
proxyBingTerms: boolean;
}
export interface ConfigDiagnostics {
enabled?: boolean; // master toggle
saveScreenshot?: boolean; // capture .png
saveHtml?: boolean; // capture .html
maxPerRun?: number; // cap number of captures per run
retentionDays?: number; // delete older diagnostic folders
}
export interface ConfigUpdate {
git?: boolean; // if true, run git pull + npm ci + npm run build after completion
docker?: boolean; // if true, run docker update routine (compose pull/up) after completion
scriptPath?: string; // optional custom path to update script relative to repo root
}
export interface ConfigBuyMode {
enabled?: boolean; // if true, force buy mode session
maxMinutes?: number; // session duration cap
}
export interface ConfigSchedule {
enabled?: boolean;
time?: string; // Back-compat: accepts "HH:mm" or "h:mm AM/PM"
// New optional explicit times
time12?: string; // e.g., "9:00 AM"
time24?: string; // e.g., "09:00"
timeZone?: string; // IANA TZ e.g., "America/New_York"
useAmPm?: boolean; // If true, prefer time12 + AM/PM style; if false, prefer time24. If undefined, back-compat behavior.
runImmediatelyOnStart?: boolean; // if true, run once immediately when process starts
}
export interface ConfigVacation {
enabled?: boolean; // default false
minDays?: number; // default 3
maxDays?: number; // default 5
}
export interface ConfigCrashRecovery {
autoRestart?: boolean; // Restart the root process after fatal crash
maxRestarts?: number; // Max restart attempts (default 2)
backoffBaseMs?: number; // Base backoff before restart (default 2000)
restartFailedWorker?: boolean; // (future) attempt to respawn crashed worker
restartFailedWorkerAttempts?: number; // attempts per worker (default 1)
}
export interface ConfigWorkers {
doDailySet: boolean;
doMorePromotions: boolean;
@@ -53,4 +120,60 @@ export interface ConfigWorkers {
doMobileSearch: boolean;
doDailyCheckIn: boolean;
doReadToEarn: boolean;
bundleDailySetWithSearch?: boolean; // If true, run desktop search right after Daily Set
}
// Anti-ban humanization
export interface ConfigHumanization {
// Master toggle for Human Mode. When false, humanization is minimized.
enabled?: boolean;
// If true, stop processing remaining accounts after a ban is detected
stopOnBan?: boolean;
// If true, send an immediate webhook/NTFY alert when a ban is detected
immediateBanAlert?: boolean;
// Additional random waits between actions
actionDelay?: { min: number | string; max: number | string };
// Probability [0..1] to perform micro mouse moves per step
gestureMoveProb?: number;
// Probability [0..1] to perform tiny scrolls per step
gestureScrollProb?: number;
// Allowed execution windows (local time). Each item is "HH:mm-HH:mm".
// If provided, runs outside these windows will be delayed until the next allowed window.
allowedWindows?: string[];
// Randomly skip N days per week to look more human (0-7). Default 1.
randomOffDaysPerWeek?: number;
}
// Retry/backoff policy
export interface ConfigRetryPolicy {
maxAttempts?: number; // default 3
baseDelay?: number | string; // default 1000ms
maxDelay?: number | string; // default 30s
multiplier?: number; // default 2
jitter?: number; // 0..1; default 0.2
}
// Job state persistence
export interface ConfigJobState {
enabled?: boolean; // default true
dir?: string; // base directory; defaults to <sessionPath>/job-state
}
// Live logging configuration
export interface ConfigLoggingLive {
enabled?: boolean; // master switch for live webhook logs
redactEmails?: boolean; // if true, redact emails in outbound logs
}
export interface ConfigLogging {
excludeFunc?: string[];
webhookExcludeFunc?: string[];
live?: ConfigLoggingLive;
liveWebhookUrl?: string; // legacy/dedicated live webhook override
redactEmails?: boolean; // legacy top-level redaction flag
// Optional nested live.url support (already handled dynamically in Logger)
[key: string]: unknown; // forward compatibility
}
// CommunityHelp removed (privacy-first policy)
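To show how these optional sections fit together, a hedged example with illustrative values (not recommendations); the import path is an assumption:

import type { Config } from '../interface/Config' // path assumed

// Illustrative values only; every section below is optional per the interface.
const exampleConfig: Partial<Config> = {
    humanization: {
        enabled: true,
        stopOnBan: false,
        immediateBanAlert: true,
        actionDelay: { min: 150, max: 450 },
        gestureMoveProb: 0.5,
        gestureScrollProb: 0.25,
        allowedWindows: ['08:00-11:30', '18:00-22:00'],
        randomOffDaysPerWeek: 1
    },
    retryPolicy: { maxAttempts: 3, baseDelay: 1000, maxDelay: 30000, multiplier: 2, jitter: 0.2 },
    jobState: { enabled: true },
    schedule: { enabled: true, time24: '09:00', timeZone: 'America/New_York', useAmPm: false, runImmediatelyOnStart: false },
    vacation: { enabled: false, minDays: 3, maxDays: 5 },
    crashRecovery: { autoRestart: true, maxRestarts: 2, backoffBaseMs: 2000 }
}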


@@ -1,155 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
export PLAYWRIGHT_BROWSERS_PATH=/ms-playwright
export TZ="${TZ:-UTC}"
cd /usr/src/microsoft-rewards-script
LOCKFILE=/tmp/run_daily.lock
# -------------------------------
# Function: Check and fix lockfile integrity
# -------------------------------
self_heal_lockfile() {
# If lockfile exists but is empty → remove it
if [ -f "$LOCKFILE" ]; then
local lock_content
lock_content=$(<"$LOCKFILE" || echo "")
if [[ -z "$lock_content" ]]; then
echo "[$(date)] [run_daily.sh] Found empty lockfile → removing."
rm -f "$LOCKFILE"
return
fi
# If lockfile contains non-numeric PID → remove it
if ! [[ "$lock_content" =~ ^[0-9]+$ ]]; then
echo "[$(date)] [run_daily.sh] Found corrupted lockfile content ('$lock_content') → removing."
rm -f "$LOCKFILE"
return
fi
# If lockfile contains PID but process is dead → remove it
if ! kill -0 "$lock_content" 2>/dev/null; then
echo "[$(date)] [run_daily.sh] Lockfile PID $lock_content is dead → removing stale lock."
rm -f "$LOCKFILE"
return
fi
fi
}
# -------------------------------
# Function: Acquire lock
# -------------------------------
acquire_lock() {
local max_attempts=5
local attempt=0
local timeout_hours=${STUCK_PROCESS_TIMEOUT_HOURS:-8}
local timeout_seconds=$((timeout_hours * 3600))
while [ $attempt -lt $max_attempts ]; do
# Try to create lock with current PID
if (set -C; echo "$$" > "$LOCKFILE") 2>/dev/null; then
echo "[$(date)] [run_daily.sh] Lock acquired successfully (PID: $$)"
return 0
fi
# Lock exists, validate it
if [ -f "$LOCKFILE" ]; then
local existing_pid
existing_pid=$(<"$LOCKFILE" || echo "")
echo "[$(date)] [run_daily.sh] Lock file exists with PID: '$existing_pid'"
# If lockfile content is invalid → delete and retry
if [[ -z "$existing_pid" || ! "$existing_pid" =~ ^[0-9]+$ ]]; then
echo "[$(date)] [run_daily.sh] Removing invalid lockfile → retrying..."
rm -f "$LOCKFILE"
continue
fi
# If process is dead → delete and retry
if ! kill -0 "$existing_pid" 2>/dev/null; then
echo "[$(date)] [run_daily.sh] Removing stale lock (dead PID: $existing_pid)"
rm -f "$LOCKFILE"
continue
fi
# Check process runtime → kill if exceeded timeout
local process_age
if process_age=$(ps -o etimes= -p "$existing_pid" 2>/dev/null | tr -d ' '); then
if [ "$process_age" -gt "$timeout_seconds" ]; then
echo "[$(date)] [run_daily.sh] Killing stuck process $existing_pid (${process_age}s > ${timeout_hours}h)"
kill -TERM "$existing_pid" 2>/dev/null || true
sleep 5
kill -KILL "$existing_pid" 2>/dev/null || true
rm -f "$LOCKFILE"
continue
fi
fi
fi
echo "[$(date)] [run_daily.sh] Lock held by PID $existing_pid, attempt $((attempt + 1))/$max_attempts"
sleep 2
((attempt++))
done
echo "[$(date)] [run_daily.sh] Could not acquire lock after $max_attempts attempts; exiting."
return 1
}
# -------------------------------
# Function: Release lock
# -------------------------------
release_lock() {
if [ -f "$LOCKFILE" ]; then
local lock_pid
lock_pid=$(<"$LOCKFILE")
if [ "$lock_pid" = "$$" ]; then
rm -f "$LOCKFILE"
echo "[$(date)] [run_daily.sh] Lock released (PID: $$)"
fi
fi
}
# Always release lock on exit — but only if we acquired it
trap 'release_lock' EXIT INT TERM
# -------------------------------
# MAIN EXECUTION FLOW
# -------------------------------
echo "[$(date)] [run_daily.sh] Current process PID: $$"
# Self-heal any broken or empty locks before proceeding
self_heal_lockfile
# Attempt to acquire the lock safely
if ! acquire_lock; then
exit 0
fi
# Random sleep between MIN and MAX to spread execution
MINWAIT=${MIN_SLEEP_MINUTES:-5}
MAXWAIT=${MAX_SLEEP_MINUTES:-50}
MINWAIT_SEC=$((MINWAIT*60))
MAXWAIT_SEC=$((MAXWAIT*60))
if [ "${SKIP_RANDOM_SLEEP:-false}" != "true" ]; then
SLEEPTIME=$(( MINWAIT_SEC + RANDOM % (MAXWAIT_SEC - MINWAIT_SEC) ))
echo "[$(date)] [run_daily.sh] Sleeping for $((SLEEPTIME/60)) minutes ($SLEEPTIME seconds)"
sleep "$SLEEPTIME"
else
echo "[$(date)] [run_daily.sh] Skipping random sleep"
fi
# Start the actual script
echo "[$(date)] [run_daily.sh] Starting script..."
if npm start; then
echo "[$(date)] [run_daily.sh] Script completed successfully."
else
echo "[$(date)] [run_daily.sh] ERROR: Script failed!" >&2
fi
echo "[$(date)] [run_daily.sh] Script finished"
# Lock is released automatically via trap

src/scheduler.ts (new file, 329 lines)

@@ -0,0 +1,329 @@
import { DateTime, IANAZone } from 'luxon'
import { spawn } from 'child_process'
import fs from 'fs'
import path from 'path'
import { MicrosoftRewardsBot } from './index'
import { loadConfig } from './util/Load'
import { log } from './util/Logger'
import type { Config } from './interface/Config'
function resolveTimeParts(schedule: Config['schedule'] | undefined): { tz: string; hour: number; minute: number } {
const tz = (schedule?.timeZone && IANAZone.isValidZone(schedule.timeZone)) ? schedule.timeZone : 'UTC'
// Determine source string
let src = ''
if (typeof schedule?.useAmPm === 'boolean') {
if (schedule.useAmPm) src = (schedule.time12 || schedule.time || '').trim()
else src = (schedule.time24 || schedule.time || '').trim()
} else {
// Back-compat: prefer time if present; else time24 or time12
src = (schedule?.time || schedule?.time24 || schedule?.time12 || '').trim()
}
// Try to parse 24h first: HH:mm
const m24 = src.match(/^\s*(\d{1,2}):(\d{2})\s*$/i)
if (m24) {
const hh = Math.max(0, Math.min(23, parseInt(m24[1]!, 10)))
const mm = Math.max(0, Math.min(59, parseInt(m24[2]!, 10)))
return { tz, hour: hh, minute: mm }
}
// Parse 12h with AM/PM: h:mm AM or h AM
const m12 = src.match(/^\s*(\d{1,2})(?::(\d{2}))?\s*(AM|PM)\s*$/i)
if (m12) {
let hh = parseInt(m12[1]!, 10)
const mm = m12[2] ? parseInt(m12[2]!, 10) : 0
const ampm = m12[3]!.toUpperCase()
if (hh === 12) hh = 0
if (ampm === 'PM') hh += 12
hh = Math.max(0, Math.min(23, hh))
const m = Math.max(0, Math.min(59, mm))
return { tz, hour: hh, minute: m }
}
// Fallback: default 09:00
return { tz, hour: 9, minute: 0 }
}
function parseTargetToday(now: Date, schedule: Config['schedule'] | undefined) {
const { tz, hour, minute } = resolveTimeParts(schedule)
const dtn = DateTime.fromJSDate(now, { zone: tz })
return dtn.set({ hour, minute, second: 0, millisecond: 0 })
}
async function runOnePass(): Promise<void> {
const bot = new MicrosoftRewardsBot(false)
await bot.initialize()
await bot.run()
}
/**
* Run a single pass either in-process or as a child process (default),
* with a watchdog timeout to kill stuck runs.
*/
async function runOnePassWithWatchdog(): Promise<void> {
// Heartbeat-aware watchdog configuration
// If a child is actively updating its heartbeat file, we allow it to run beyond the legacy timeout.
// Defaults are generous to allow first-day passes to finish searches with delays.
const staleHeartbeatMin = Number(
process.env.SCHEDULER_STALE_HEARTBEAT_MINUTES || process.env.SCHEDULER_PASS_TIMEOUT_MINUTES || 30
)
const graceMin = Number(process.env.SCHEDULER_HEARTBEAT_GRACE_MINUTES || 15)
const hardcapMin = Number(process.env.SCHEDULER_PASS_HARDCAP_MINUTES || 480) // 8 hours
const checkEveryMs = 60_000 // check once per minute
// Fork per pass: safer because we can terminate a stuck child without killing the scheduler
const forkPerPass = String(process.env.SCHEDULER_FORK_PER_PASS || 'true').toLowerCase() !== 'false'
if (!forkPerPass) {
// In-process fallback (cannot forcefully stop if truly stuck)
await log('main', 'SCHEDULER', `Starting pass in-process (grace ${graceMin}m • stale ${staleHeartbeatMin}m • hardcap ${hardcapMin}m). Cannot force-kill if stuck.`)
// No true watchdog possible in-process; just run
await runOnePass()
return
}
// Child process execution
const indexJs = path.join(__dirname, 'index.js')
await log('main', 'SCHEDULER', `Spawning child for pass: ${process.execPath} ${indexJs}`)
// Prepare heartbeat file path and pass to child
const cfg = loadConfig() as Config
const baseDir = path.join(process.cwd(), cfg.sessionPath || 'sessions')
const hbFile = path.join(baseDir, `heartbeat_${Date.now()}.lock`)
try { fs.mkdirSync(baseDir, { recursive: true }) } catch { /* ignore */ }
await new Promise<void>((resolve) => {
const child = spawn(process.execPath, [indexJs], { stdio: 'inherit', env: { ...process.env, SCHEDULER_HEARTBEAT_FILE: hbFile } })
let finished = false
const startedAt = Date.now()
const killChild = async (signal: NodeJS.Signals) => {
try {
await log('main', 'SCHEDULER', `Sending ${signal} to stuck child PID ${child.pid}`,'warn')
child.kill(signal)
} catch { /* ignore */ }
}
const timer = setInterval(() => {
if (finished) return
const now = Date.now()
const runtimeMin = Math.floor((now - startedAt) / 60000)
// Hard cap: always terminate if exceeded
if (runtimeMin >= hardcapMin) {
log('main', 'SCHEDULER', `Pass exceeded hard cap of ${hardcapMin} minutes; terminating...`, 'warn')
void killChild('SIGTERM')
setTimeout(() => { try { child.kill('SIGKILL') } catch { /* ignore */ } }, 10_000)
return
}
// Before grace, don't judge
if (runtimeMin < graceMin) return
// Check heartbeat freshness
try {
const st = fs.statSync(hbFile)
const mtimeMs = st.mtimeMs
const ageMin = Math.floor((now - mtimeMs) / 60000)
if (ageMin >= staleHeartbeatMin) {
log('main', 'SCHEDULER', `Heartbeat stale for ${ageMin}m (>=${staleHeartbeatMin}m). Terminating child...`, 'warn')
void killChild('SIGTERM')
setTimeout(() => { try { child.kill('SIGKILL') } catch { /* ignore */ } }, 10_000)
}
} catch {
// If file missing after grace, consider stale
log('main', 'SCHEDULER', 'Heartbeat file missing after grace. Terminating child...', 'warn')
void killChild('SIGTERM')
setTimeout(() => { try { child.kill('SIGKILL') } catch { /* ignore */ } }, 10_000)
}
}, checkEveryMs)
child.on('exit', async (code, signal) => {
finished = true
clearInterval(timer)
// Cleanup heartbeat file
try { if (fs.existsSync(hbFile)) fs.unlinkSync(hbFile) } catch { /* ignore */ }
if (signal) {
await log('main', 'SCHEDULER', `Child exited due to signal: ${signal}`, 'warn')
} else if (code && code !== 0) {
await log('main', 'SCHEDULER', `Child exited with non-zero code: ${code}`, 'warn')
} else {
await log('main', 'SCHEDULER', 'Child pass completed successfully')
}
resolve()
})
child.on('error', async (err) => {
finished = true
clearInterval(timer)
try { if (fs.existsSync(hbFile)) fs.unlinkSync(hbFile) } catch { /* ignore */ }
await log('main', 'SCHEDULER', `Failed to spawn child: ${err instanceof Error ? err.message : String(err)}`, 'error')
resolve()
})
})
}
async function runPasses(passes: number): Promise<void> {
const n = Math.max(1, Math.floor(passes || 1))
for (let i = 1; i <= n; i++) {
await log('main', 'SCHEDULER', `Starting pass ${i}/${n}`)
const started = Date.now()
await runOnePassWithWatchdog()
const took = Date.now() - started
const sec = Math.max(1, Math.round(took / 1000))
await log('main', 'SCHEDULER', `Completed pass ${i}/${n}`)
await log('main', 'SCHEDULER', `Pass ${i} duration: ${sec}s`)
}
}
async function main() {
const cfg = loadConfig() as Config & { schedule?: { enabled?: boolean; time?: string; timeZone?: string; runImmediatelyOnStart?: boolean } }
const schedule = cfg.schedule || { enabled: false }
const passes = typeof cfg.passesPerRun === 'number' ? cfg.passesPerRun : 1
const offPerWeek = Math.max(0, Math.min(7, Number(cfg.humanization?.randomOffDaysPerWeek ?? 1)))
let offDays: number[] = [] // 1..7 ISO weekday
let offWeek: number | null = null
type VacRange = { start: string; end: string } | null
let vacMonth: string | null = null // 'yyyy-LL'
let vacRange: VacRange = null // ISO dates 'yyyy-LL-dd'
const refreshOffDays = async (now: { weekNumber: number }) => {
if (offPerWeek <= 0) { offDays = []; offWeek = null; return }
const week = now.weekNumber
if (offWeek === week && offDays.length) return
// choose distinct weekdays [1..7]
const pool = [1,2,3,4,5,6,7]
const chosen: number[] = []
for (let i=0;i<Math.min(offPerWeek,7);i++) {
const idx = Math.floor(Math.random()*pool.length)
chosen.push(pool[idx]!)
pool.splice(idx,1)
}
offDays = chosen.sort((a,b)=>a-b)
offWeek = week
await log('main','SCHEDULER',`Selected random off-days this week (ISO): ${offDays.join(', ')}`,'warn')
}
const chooseVacationRange = async (now: typeof DateTime.prototype) => {
// Only when enabled
if (!cfg.vacation?.enabled) { vacRange = null; vacMonth = null; return }
const monthKey = now.toFormat('yyyy-LL')
if (vacMonth === monthKey && vacRange) return
// Determine month days and choose contiguous block
const monthStart = now.startOf('month')
const monthEnd = now.endOf('month')
const totalDays = monthEnd.day
const minD = Math.max(1, Math.min(28, Number(cfg.vacation.minDays ?? 3)))
const maxD = Math.max(minD, Math.min(31, Number(cfg.vacation.maxDays ?? 5)))
const span = (minD === maxD) ? minD : (minD + Math.floor(Math.random() * (maxD - minD + 1)))
const latestStart = Math.max(1, totalDays - span + 1)
const startDay = 1 + Math.floor(Math.random() * latestStart)
const start = monthStart.set({ day: startDay })
const end = start.plus({ days: span - 1 })
vacMonth = monthKey
vacRange = { start: start.toFormat('yyyy-LL-dd'), end: end.toFormat('yyyy-LL-dd') }
await log('main','SCHEDULER',`Selected vacation block this month: ${vacRange.start} → ${vacRange.end} (${span} day(s))`,'warn')
}
if (!schedule.enabled) {
await log('main', 'SCHEDULER', 'Schedule disabled; running once then exit')
await runPasses(passes)
process.exit(0)
}
const tz = (schedule.timeZone && IANAZone.isValidZone(schedule.timeZone)) ? schedule.timeZone : 'UTC'
// Default to false to avoid unexpected immediate runs
const runImmediate = schedule.runImmediatelyOnStart === true
let running = false
// Optional initial jitter before the first run (to vary start time)
const initJitterMin = Number(process.env.SCHEDULER_INITIAL_JITTER_MINUTES_MIN || process.env.SCHEDULER_INITIAL_JITTER_MIN || 0)
const initJitterMax = Number(process.env.SCHEDULER_INITIAL_JITTER_MINUTES_MAX || process.env.SCHEDULER_INITIAL_JITTER_MAX || 0)
const initialJitterBounds: [number, number] = [isFinite(initJitterMin) ? initJitterMin : 0, isFinite(initJitterMax) ? initJitterMax : 0]
const applyInitialJitter = (initialJitterBounds[0] > 0 || initialJitterBounds[1] > 0)
if (runImmediate && !running) {
running = true
if (applyInitialJitter) {
const min = Math.max(0, Math.min(initialJitterBounds[0], initialJitterBounds[1]))
const max = Math.max(0, Math.max(initialJitterBounds[0], initialJitterBounds[1]))
const jitterSec = (min === max) ? min * 60 : (min * 60 + Math.floor(Math.random() * ((max - min) * 60)))
if (jitterSec > 0) {
await log('main', 'SCHEDULER', `Initial jitter: delaying first run by ${Math.round(jitterSec / 60)} minute(s) (${jitterSec}s)`, 'warn')
await new Promise((r) => setTimeout(r, jitterSec * 1000))
}
}
const nowDT = DateTime.local().setZone(tz)
await chooseVacationRange(nowDT)
await refreshOffDays(nowDT)
const todayIso = nowDT.toFormat('yyyy-LL-dd')
const vr = vacRange as { start: string; end: string } | null
const isVacationToday = !!(vr && todayIso >= vr.start && todayIso <= vr.end)
if (isVacationToday) {
await log('main','SCHEDULER',`Skipping immediate run: vacation day (${todayIso})`,'warn')
} else if (offDays.includes(nowDT.weekday)) {
await log('main','SCHEDULER',`Skipping immediate run: off-day (weekday ${nowDT.weekday})`,'warn')
} else {
await runPasses(passes)
}
running = false
}
for (;;) {
const now = new Date()
const targetToday = parseTargetToday(now, schedule)
let next = targetToday
const nowDT = DateTime.fromJSDate(now, { zone: targetToday.zone })
if (nowDT >= targetToday) {
next = targetToday.plus({ days: 1 })
}
let ms = Math.max(0, next.toMillis() - nowDT.toMillis())
// Optional daily jitter to further randomize the exact start time each day
const dailyJitterMin = Number(process.env.SCHEDULER_DAILY_JITTER_MINUTES_MIN || process.env.SCHEDULER_DAILY_JITTER_MIN || 0)
const dailyJitterMax = Number(process.env.SCHEDULER_DAILY_JITTER_MINUTES_MAX || process.env.SCHEDULER_DAILY_JITTER_MAX || 0)
const djMin = isFinite(dailyJitterMin) ? dailyJitterMin : 0
const djMax = isFinite(dailyJitterMax) ? dailyJitterMax : 0
let extraMs = 0
if (djMin > 0 || djMax > 0) {
const mn = Math.max(0, Math.min(djMin, djMax))
const mx = Math.max(0, Math.max(djMin, djMax))
const jitterSec = (mn === mx) ? mn * 60 : (mn * 60 + Math.floor(Math.random() * ((mx - mn) * 60)))
extraMs = jitterSec * 1000
ms += extraMs
}
const human = next.toFormat('yyyy-LL-dd HH:mm ZZZZ')
const totalSec = Math.round(ms / 1000)
if (extraMs > 0) {
await log('main', 'SCHEDULER', `Next run at ${human} plus daily jitter (+${Math.round(extraMs/60000)}m) → in ${totalSec}s`)
} else {
await log('main', 'SCHEDULER', `Next run at ${human} (in ${totalSec}s)`)
}
await new Promise((resolve) => setTimeout(resolve, ms))
const nowRun = DateTime.local().setZone(tz)
await chooseVacationRange(nowRun)
await refreshOffDays(nowRun)
const todayIso2 = nowRun.toFormat('yyyy-LL-dd')
const vr2 = vacRange as { start: string; end: string } | null
const isVacation = !!(vr2 && todayIso2 >= vr2.start && todayIso2 <= vr2.end)
if (isVacation) {
await log('main','SCHEDULER',`Skipping scheduled run: vacation day (${todayIso2})`,'warn')
continue
}
if (offDays.includes(nowRun.weekday)) {
await log('main','SCHEDULER',`Skipping scheduled run: off-day (weekday ${nowRun.weekday})`,'warn')
continue
}
if (!running) {
running = true
await runPasses(passes)
running = false
} else {
await log('main','SCHEDULER','Skipped scheduled trigger because a pass is already running','warn')
}
}
}
main().catch((e) => {
console.error(e)
process.exit(1)
})

src/types/luxon.d.ts (vendored, new file, 7 lines)

@@ -0,0 +1,7 @@
/* Minimal ambient declarations to unblock TypeScript when @types/luxon not present. */
declare module 'luxon' {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const DateTime: any
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const IANAZone: any
}


@@ -0,0 +1,25 @@
export class AdaptiveThrottler {
private errorCount = 0
private successCount = 0
private window: Array<{ ok: boolean; at: number }> = []
private readonly maxWindow = 50
record(ok: boolean) {
this.window.push({ ok, at: Date.now() })
if (ok) this.successCount++
else this.errorCount++
if (this.window.length > this.maxWindow) {
const removed = this.window.shift()
if (removed) removed.ok ? this.successCount-- : this.errorCount--
}
}
/** Return a multiplier to apply to waits (1 = normal). */
getDelayMultiplier(): number {
const total = Math.max(1, this.successCount + this.errorCount)
const errRatio = this.errorCount / total
// 0% errors -> 1x; 50% errors -> 2x; 75%+ errors -> 2.5x (cap)
const mult = 1 + Math.min(1.5, errRatio * 2)
return Number(mult.toFixed(2))
}
}
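A hedged usage sketch; the import path is an assumption:

import { AdaptiveThrottler } from './AdaptiveThrottler' // path assumed

const throttler = new AdaptiveThrottler()

// Stretch a base wait by the current multiplier (1x with no errors, up to 2.5x)
async function waitAdaptive(baseMs: number): Promise<void> {
    const ms = Math.floor(baseMs * throttler.getDelayMultiplier())
    await new Promise((resolve) => setTimeout(resolve, ms))
}

// After each action, feed the outcome back in:
//   throttler.record(true)   on success
//   throttler.record(false)  on failure, so later waits grow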


@@ -42,7 +42,21 @@ class AxiosClient {
return bypassInstance.request(config)
}
return this.instance.request(config)
try {
return await this.instance.request(config)
} catch (err: unknown) {
// If proxied request fails with common proxy/network errors, retry once without proxy
const e = err as { code?: string; cause?: { code?: string }; message?: string } | undefined
const code = e?.code || e?.cause?.code
const isNetErr = code === 'ECONNREFUSED' || code === 'ETIMEDOUT' || code === 'ECONNRESET' || code === 'ENOTFOUND'
const msg = String(e?.message || '')
const looksLikeProxyIssue = /proxy|tunnel|socks|agent/i.test(msg)
if (!bypassProxy && (isNetErr || looksLikeProxyIssue)) {
const bypassInstance = axios.create()
return bypassInstance.request(config)
}
throw err
}
}
}

src/util/BanDetector.ts (new file, 16 lines)

@@ -0,0 +1,16 @@
export type BanStatus = { status: boolean; reason: string }
const BAN_PATTERNS: Array<{ re: RegExp; reason: string }> = [
{ re: /suspend|suspended|suspension/i, reason: 'account suspended' },
{ re: /locked|lockout|serviceabuse|abuse/i, reason: 'locked or service abuse detected' },
{ re: /unusual.*activity|unusual activity/i, reason: 'unusual activity prompts' },
{ re: /verify.*identity|identity.*verification/i, reason: 'identity verification required' }
]
export function detectBanReason(input: unknown): BanStatus {
const s = input instanceof Error ? (input.message || '') : String(input || '')
for (const p of BAN_PATTERNS) {
if (p.re.test(s)) return { status: true, reason: p.reason }
}
return { status: false, reason: '' }
}
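A hedged usage sketch wrapping an arbitrary operation:

import { detectBanReason } from './BanDetector'

// Run an operation and surface a ban reason if the error matches known patterns
async function guarded<T>(op: () => Promise<T>): Promise<T> {
    try {
        return await op()
    } catch (err) {
        const ban = detectBanReason(err)
        if (ban.status) {
            // e.g. alert immediately; the caller decides whether to stop remaining accounts
            console.warn(`Possible ban detected: ${ban.reason}`)
        }
        throw err
    }
}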


@@ -1,32 +1,109 @@
import axios from 'axios'
import { Config } from '../interface/Config'
import { Ntfy } from './Ntfy'
// Light obfuscation of the avatar URL (base64). Prevents casual editing in config.
const AVATAR_B64 = 'aHR0cHM6Ly9tZWRpYS5kaXNjb3JkYXBwLm5ldC9hdHRhY2htZW50cy8xNDIxMTYzOTUyOTcyMzY5OTMxLzE0MjExNjQxNDU5OTQyNDAxMTAvbXNuLnBuZz93aWR0aD01MTImZWlnaHQ9NTEy'
function getAvatarUrl(): string {
try { return Buffer.from(AVATAR_B64, 'base64').toString('utf-8') } catch { return '' }
}
type WebhookContext = 'summary' | 'ban' | 'security' | 'compromised' | 'spend' | 'error' | 'default'
function pickUsername(ctx: WebhookContext, fallbackColor?: number): string {
switch (ctx) {
case 'summary': return 'Summary'
case 'ban': return 'Ban'
case 'security': return 'Security'
case 'compromised': return 'Pirate'
case 'spend': return 'Spend'
case 'error': return 'Error'
default: return fallbackColor === 0xFF0000 ? 'Error' : 'Rewards'
}
}
interface DiscordField { name: string; value: string; inline?: boolean }
interface DiscordEmbed {
title?: string
description?: string
color?: number
fields?: DiscordField[]
}
interface ConclusionPayload {
content?: string
embeds?: any[]
embeds?: DiscordEmbed[]
context?: WebhookContext
}
/**
* Send a final structured summary to the dedicated conclusion webhook (if enabled),
* otherwise do nothing. Does NOT fallback to the normal logging webhook to avoid spam.
* Send a final structured summary to the configured webhook,
* and optionally mirror a plain-text summary to NTFY.
*
* This preserves existing webhook behavior while adding NTFY
* as a separate, optional channel.
*/
export async function ConclusionWebhook(configData: Config, content: string, embed?: ConclusionPayload) {
const webhook = configData.conclusionWebhook
export async function ConclusionWebhook(config: Config, content: string, payload?: ConclusionPayload) {
// Send to both webhooks when available
const hasConclusion = !!(config.conclusionWebhook?.enabled && config.conclusionWebhook.url)
const hasWebhook = !!(config.webhook?.enabled && config.webhook.url)
const sameTarget = hasConclusion && hasWebhook && config.conclusionWebhook!.url === config.webhook!.url
if (!webhook || !webhook.enabled || webhook.url.length < 10) return
const body: ConclusionPayload & { username?: string; avatar_url?: string } = {}
if (payload?.embeds) body.embeds = payload.embeds
if (content && content.trim()) body.content = content
const firstColor = payload?.embeds && payload.embeds[0]?.color
const ctx: WebhookContext = payload?.context || (firstColor === 0xFF0000 ? 'error' : 'default')
body.username = pickUsername(ctx, firstColor)
body.avatar_url = getAvatarUrl()
const body: ConclusionPayload = embed?.embeds ? { embeds: embed.embeds } : { content }
if (content && !body.content && !body.embeds) body.content = content
const request = {
method: 'POST',
url: webhook.url,
headers: {
'Content-Type': 'application/json'
},
data: body
// Post to conclusion webhook if configured
const postWithRetry = async (url: string, label: string) => {
const max = 2
let lastErr: unknown = null
for (let attempt = 1; attempt <= max; attempt++) {
try {
await axios.post(url, body, { headers: { 'Content-Type': 'application/json' }, timeout: 15000 })
console.log(`[Webhook:${label}] summary sent (attempt ${attempt}).`)
return
} catch (e) {
lastErr = e
if (attempt === max) break
await new Promise(r => setTimeout(r, 1000 * attempt))
}
}
console.error(`[Webhook:${label}] failed after ${max} attempts:`, lastErr)
}
await axios(request).catch(() => { })
if (hasConclusion) {
await postWithRetry(config.conclusionWebhook!.url, sameTarget ? 'conclusion+primary' : 'conclusion')
}
if (hasWebhook && !sameTarget) {
await postWithRetry(config.webhook!.url, 'primary')
}
// NTFY: mirror a plain text summary (optional)
if (config.ntfy?.enabled && config.ntfy.url && config.ntfy.topic) {
let message = content || ''
if (!message && payload?.embeds && payload.embeds.length > 0) {
const e: DiscordEmbed = payload.embeds[0]!
const title = e.title ? `${e.title}\n` : ''
const desc = e.description ? `${e.description}\n` : ''
const totals = e.fields && e.fields[0]?.value ? `\n${e.fields[0].value}\n` : ''
message = `${title}${desc}${totals}`.trim()
}
if (!message) message = 'Microsoft Rewards run complete.'
// Choose NTFY level based on embed color (yellow = warn)
let embedColor: number | undefined
if (payload?.embeds && payload.embeds.length > 0) {
embedColor = payload.embeds[0]!.color
}
const ntfyType = embedColor === 0xFFAA00 ? 'warn' : 'log'
try {
await Ntfy(message, ntfyType)
console.log('Conclusion summary sent to NTFY.')
} catch (err) {
console.error('Failed to send conclusion summary to NTFY:', err)
}
}
}
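A hedged sketch of a final-summary call; import paths and field values are illustrative:

import { ConclusionWebhook } from './ConclusionWebhook' // path assumed
import { loadConfig } from './Load'

async function sendRunSummary(totalPointsGained: number): Promise<void> {
    await ConclusionWebhook(loadConfig(), '', {
        context: 'summary',
        embeds: [{
            title: 'Microsoft Rewards run complete',
            description: 'All accounts processed.',
            color: 0x00AA00,
            fields: [{ name: 'Totals', value: `Points gained: ${totalPointsGained}`, inline: false }]
        }]
    })
}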

src/util/Humanizer.ts (new file, 54 lines)

@@ -0,0 +1,54 @@
import { Page } from 'rebrowser-playwright'
import Util from './Utils'
import type { ConfigHumanization } from '../interface/Config'
export class Humanizer {
private util: Util
private cfg: ConfigHumanization | undefined
constructor(util: Util, cfg?: ConfigHumanization) {
this.util = util
this.cfg = cfg
}
async microGestures(page: Page): Promise<void> {
if (this.cfg && this.cfg.enabled === false) return
const moveProb = this.cfg?.gestureMoveProb ?? 0.4
const scrollProb = this.cfg?.gestureScrollProb ?? 0.2
try {
if (Math.random() < moveProb) {
const x = Math.floor(Math.random() * 40) + 5
const y = Math.floor(Math.random() * 30) + 5
await page.mouse.move(x, y, { steps: 2 }).catch(() => {})
}
if (Math.random() < scrollProb) {
const dy = (Math.random() < 0.5 ? 1 : -1) * (Math.floor(Math.random() * 150) + 50)
await page.mouse.wheel(0, dy).catch(() => {})
}
} catch {/* noop */}
}
async actionPause(): Promise<void> {
if (this.cfg && this.cfg.enabled === false) return
const defMin = 150
const defMax = 450
let min = defMin
let max = defMax
if (this.cfg?.actionDelay) {
const parse = (v: number | string) => {
if (typeof v === 'number') return v
try {
const n = this.util.stringToMs(String(v))
return Math.max(0, Math.min(n, 10_000))
} catch { return defMin }
}
min = parse(this.cfg.actionDelay.min)
max = parse(this.cfg.actionDelay.max)
if (min > max) [min, max] = [max, min]
max = Math.min(max, 5_000)
}
await this.util.wait(this.util.randomNumber(min, max))
}
}
export default Humanizer
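A hedged sketch of one humanized step between two page actions:

import { Page } from 'rebrowser-playwright'
import Util from './Utils'
import Humanizer from './Humanizer'
import { loadConfig } from './Load'

async function humanizedStep(page: Page): Promise<void> {
    const humanizer = new Humanizer(new Util(), loadConfig().humanization)
    await humanizer.microGestures(page) // occasional tiny mouse move / scroll
    await humanizer.actionPause()       // short randomized pause between actions
}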

src/util/JobState.ts (new file, 58 lines)

@@ -0,0 +1,58 @@
import fs from 'fs'
import path from 'path'
import type { Config } from '../interface/Config'
type DayState = {
doneOfferIds: string[]
}
type FileState = {
days: Record<string, DayState>
}
export class JobState {
private baseDir: string
constructor(cfg: Config) {
const dir = cfg.jobState?.dir || path.join(process.cwd(), cfg.sessionPath, 'job-state')
this.baseDir = dir
if (!fs.existsSync(this.baseDir)) fs.mkdirSync(this.baseDir, { recursive: true })
}
private fileFor(email: string): string {
const safe = email.replace(/[^a-z0-9._-]/gi, '_')
return path.join(this.baseDir, `${safe}.json`)
}
private load(email: string): FileState {
const file = this.fileFor(email)
if (!fs.existsSync(file)) return { days: {} }
try {
const raw = fs.readFileSync(file, 'utf-8')
const parsed = JSON.parse(raw)
return parsed && typeof parsed === 'object' && parsed.days ? parsed as FileState : { days: {} }
} catch { return { days: {} } }
}
private save(email: string, state: FileState): void {
const file = this.fileFor(email)
fs.writeFileSync(file, JSON.stringify(state, null, 2), 'utf-8')
}
isDone(email: string, day: string, offerId: string): boolean {
const st = this.load(email)
const d = st.days[day]
if (!d) return false
return d.doneOfferIds.includes(offerId)
}
markDone(email: string, day: string, offerId: string): void {
const st = this.load(email)
if (!st.days[day]) st.days[day] = { doneOfferIds: [] }
const d = st.days[day]
if (!d.doneOfferIds.includes(offerId)) d.doneOfferIds.push(offerId)
this.save(email, st)
}
}
export default JobState
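A hedged usage sketch; the day-key format is an assumption (the class treats it as an opaque string):

import JobState from './JobState'
import { loadConfig } from './Load'

const jobState = new JobState(loadConfig())
const day = new Date().toISOString().slice(0, 10) // e.g. '2025-01-31'; format assumed

function shouldRun(email: string, offerId: string): boolean {
    return !jobState.isDone(email, day, offerId)
}

function markCompleted(email: string, offerId: string): void {
    jobState.markDone(email, day, offerId)
}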


@@ -8,38 +8,274 @@ import { Account } from '../interface/Account'
import { Config, ConfigSaveFingerprint } from '../interface/Config'
let configCache: Config
let configSourcePath = ''
// Basic JSON comment stripper (supports // line and /* block */ comments while preserving strings)
function stripJsonComments(input: string): string {
let out = ''
let inString = false
let stringChar = ''
let inLine = false
let inBlock = false
for (let i = 0; i < input.length; i++) {
const ch = input[i]!
const next = input[i + 1]
if (inLine) {
if (ch === '\n' || ch === '\r') {
inLine = false
out += ch
}
continue
}
if (inBlock) {
if (ch === '*' && next === '/') {
inBlock = false
i++
}
continue
}
if (inString) {
out += ch
if (ch === '\\') { // escape next char
i++
if (i < input.length) out += input[i]
continue
}
if (ch === stringChar) {
inString = false
}
continue
}
if (ch === '"' || ch === '\'') {
inString = true
stringChar = ch
out += ch
continue
}
if (ch === '/' && next === '/') {
inLine = true
i++
continue
}
if (ch === '/' && next === '*') {
inBlock = true
i++
continue
}
out += ch
}
return out
}
// Normalize both legacy (flat) and new (nested) config schemas into the flat Config interface
function normalizeConfig(raw: unknown): Config {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const n: any = (raw as any) || {}
// Browser / execution
const headless = n.browser?.headless ?? n.headless ?? false
const globalTimeout = n.browser?.globalTimeout ?? n.globalTimeout ?? '30s'
const parallel = n.execution?.parallel ?? n.parallel ?? false
const runOnZeroPoints = n.execution?.runOnZeroPoints ?? n.runOnZeroPoints ?? false
const clusters = n.execution?.clusters ?? n.clusters ?? 1
const passesPerRun = n.execution?.passesPerRun ?? n.passesPerRun
// Search
const useLocalQueries = n.search?.useLocalQueries ?? n.searchOnBingLocalQueries ?? false
const searchSettingsSrc = n.search?.settings ?? n.searchSettings ?? {}
const delaySrc = searchSettingsSrc.delay ?? searchSettingsSrc.searchDelay ?? { min: '3min', max: '5min' }
const searchSettings = {
useGeoLocaleQueries: !!(searchSettingsSrc.useGeoLocaleQueries ?? false),
scrollRandomResults: !!(searchSettingsSrc.scrollRandomResults ?? false),
clickRandomResults: !!(searchSettingsSrc.clickRandomResults ?? false),
retryMobileSearchAmount: Number(searchSettingsSrc.retryMobileSearchAmount ?? 2),
searchDelay: {
min: delaySrc.min ?? '3min',
max: delaySrc.max ?? '5min'
},
localFallbackCount: Number(searchSettingsSrc.localFallbackCount ?? 25),
extraFallbackRetries: Number(searchSettingsSrc.extraFallbackRetries ?? 1)
}
// Workers
const workers = n.workers ?? {
doDailySet: true,
doMorePromotions: true,
doPunchCards: true,
doDesktopSearch: true,
doMobileSearch: true,
doDailyCheckIn: true,
doReadToEarn: true,
bundleDailySetWithSearch: false
}
// Ensure missing flag gets a default
if (typeof workers.bundleDailySetWithSearch !== 'boolean') workers.bundleDailySetWithSearch = false
// Logging
const logging = n.logging ?? {}
const logExcludeFunc = Array.isArray(logging.excludeFunc) ? logging.excludeFunc : (n.logExcludeFunc ?? [])
const webhookLogExcludeFunc = Array.isArray(logging.webhookExcludeFunc) ? logging.webhookExcludeFunc : (n.webhookLogExcludeFunc ?? [])
// Notifications
const notifications = n.notifications ?? {}
const webhook = notifications.webhook ?? n.webhook ?? { enabled: false, url: '' }
const conclusionWebhook = notifications.conclusionWebhook ?? n.conclusionWebhook ?? { enabled: false, url: '' }
const ntfy = notifications.ntfy ?? n.ntfy ?? { enabled: false, url: '', topic: '', authToken: '' }
// Buy Mode
const buyMode = n.buyMode ?? {}
const buyModeEnabled = typeof buyMode.enabled === 'boolean' ? buyMode.enabled : false
const buyModeMax = typeof buyMode.maxMinutes === 'number' ? buyMode.maxMinutes : 45
// Fingerprinting
const saveFingerprint = (n.fingerprinting?.saveFingerprint ?? n.saveFingerprint) ?? { mobile: false, desktop: false }
// Humanization defaults (single on/off)
if (!n.humanization) n.humanization = {}
if (typeof n.humanization.enabled !== 'boolean') n.humanization.enabled = true
if (typeof n.humanization.stopOnBan !== 'boolean') n.humanization.stopOnBan = false
if (typeof n.humanization.immediateBanAlert !== 'boolean') n.humanization.immediateBanAlert = true
if (typeof n.humanization.randomOffDaysPerWeek !== 'number') {
n.humanization.randomOffDaysPerWeek = 1
}
// Strong default gestures when enabled (explicit values still win)
if (typeof n.humanization.gestureMoveProb !== 'number') {
n.humanization.gestureMoveProb = n.humanization.enabled === false ? 0 : 0.5
}
if (typeof n.humanization.gestureScrollProb !== 'number') {
n.humanization.gestureScrollProb = n.humanization.enabled === false ? 0 : 0.25
}
// Vacation mode (monthly contiguous off-days)
if (!n.vacation) n.vacation = {}
if (typeof n.vacation.enabled !== 'boolean') n.vacation.enabled = false
const vMin = Number(n.vacation.minDays)
const vMax = Number(n.vacation.maxDays)
n.vacation.minDays = isFinite(vMin) && vMin > 0 ? Math.floor(vMin) : 3
n.vacation.maxDays = isFinite(vMax) && vMax > 0 ? Math.floor(vMax) : 5
if (n.vacation.maxDays < n.vacation.minDays) {
const t = n.vacation.minDays; n.vacation.minDays = n.vacation.maxDays; n.vacation.maxDays = t
}
const cfg: Config = {
baseURL: n.baseURL ?? 'https://rewards.bing.com',
sessionPath: n.sessionPath ?? 'sessions',
headless,
parallel,
runOnZeroPoints,
clusters,
saveFingerprint,
workers,
searchOnBingLocalQueries: !!useLocalQueries,
globalTimeout,
searchSettings,
humanization: n.humanization,
retryPolicy: n.retryPolicy,
jobState: n.jobState,
logExcludeFunc,
webhookLogExcludeFunc,
logging, // retain full logging object for live webhook usage
proxy: n.proxy ?? { proxyGoogleTrends: true, proxyBingTerms: true },
webhook,
conclusionWebhook,
ntfy,
diagnostics: n.diagnostics,
update: n.update,
schedule: n.schedule,
passesPerRun: passesPerRun,
vacation: n.vacation,
buyMode: { enabled: buyModeEnabled, maxMinutes: buyModeMax },
crashRecovery: n.crashRecovery || {}
}
return cfg
}
export function loadAccounts(): Account[] {
try {
// 1) CLI dev override
let file = 'accounts.json'
// If dev mode, use dev account(s)
if (process.argv.includes('-dev')) {
file = 'accounts.dev.json'
}
const accountDir = path.join(__dirname, '../', file)
const accounts = fs.readFileSync(accountDir, 'utf-8')
// 2) Docker-friendly env overrides
const envJson = process.env.ACCOUNTS_JSON
const envFile = process.env.ACCOUNTS_FILE
return JSON.parse(accounts)
let raw: string | undefined
if (envJson && envJson.trim().startsWith('[')) {
raw = envJson
} else if (envFile && envFile.trim()) {
const full = path.isAbsolute(envFile) ? envFile : path.join(process.cwd(), envFile)
if (!fs.existsSync(full)) {
throw new Error(`ACCOUNTS_FILE not found: ${full}`)
}
raw = fs.readFileSync(full, 'utf-8')
} else {
// Try multiple locations to support both root mounts and dist mounts
const candidates = [
path.join(__dirname, '../', file), // root/accounts.json (preferred)
path.join(__dirname, '../src', file), // fallback: file kept inside src/
path.join(process.cwd(), file), // cwd override
path.join(process.cwd(), 'src', file), // cwd/src/accounts.json
path.join(__dirname, file) // dist/accounts.json (legacy)
]
let chosen: string | null = null
for (const p of candidates) {
try { if (fs.existsSync(p)) { chosen = p; break } } catch { /* ignore */ }
}
if (!chosen) throw new Error(`accounts file not found in: ${candidates.join(' | ')}`)
raw = fs.readFileSync(chosen, 'utf-8')
}
// Support comments in accounts file (same as config)
const cleaned = stripJsonComments(raw)
const parsedUnknown = JSON.parse(cleaned)
// Accept either a root array or an object with an `accounts` array, ignore `_note`
const parsed = Array.isArray(parsedUnknown) ? parsedUnknown : (parsedUnknown && typeof parsedUnknown === 'object' && Array.isArray((parsedUnknown as { accounts?: unknown }).accounts) ? (parsedUnknown as { accounts: unknown[] }).accounts : null)
if (!Array.isArray(parsed)) throw new Error('accounts must be an array')
// minimal shape validation
for (const a of parsed) {
if (!a || typeof a.email !== 'string' || typeof a.password !== 'string') {
throw new Error('each account must have email and password strings')
}
}
return parsed as Account[]
} catch (error) {
throw new Error(error as string)
}
}
export function getConfigPath(): string { return configSourcePath }
export function loadConfig(): Config {
try {
if (configCache) {
return configCache
}
const configDir = path.join(__dirname, '../', 'config.json')
const config = fs.readFileSync(configDir, 'utf-8')
// Resolve config.json from common locations
const candidates = [
path.join(__dirname, '../', 'config.json'), // root/config.json when compiled (expected primary)
path.join(__dirname, '../src', 'config.json'), // fallback: running compiled dist but file still in src/
path.join(process.cwd(), 'config.json'), // cwd root
path.join(process.cwd(), 'src', 'config.json'), // running from repo root but config left in src/
path.join(__dirname, 'config.json') // last resort: dist/util/config.json
]
let cfgPath: string | null = null
for (const p of candidates) {
try { if (fs.existsSync(p)) { cfgPath = p; break } } catch { /* ignore */ }
}
if (!cfgPath) throw new Error(`config.json not found in: ${candidates.join(' | ')}`)
const config = fs.readFileSync(cfgPath, 'utf-8')
const text = config.replace(/^\uFEFF/, '')
const raw = JSON.parse(stripJsonComments(text))
const normalized = normalizeConfig(raw)
configCache = normalized // Set as cache
configSourcePath = cfgPath
const configData = JSON.parse(config)
configCache = configData // Set as cache
return configData
return normalized
} catch (error) {
throw new Error(error as string)
}
@@ -56,13 +292,19 @@ export async function loadSessionData(sessionPath: string, email: string, isMobi
cookies = JSON.parse(cookiesData)
}
// Fetch fingerprint file
const fingerprintFile = path.join(__dirname, '../browser/', sessionPath, email, `${isMobile ? 'mobile_fingerpint' : 'desktop_fingerpint'}.json`)
// Fetch fingerprint file (support both legacy typo "fingerpint" and corrected "fingerprint")
const baseDir = path.join(__dirname, '../browser/', sessionPath, email)
const legacyFile = path.join(baseDir, `${isMobile ? 'mobile_fingerpint' : 'desktop_fingerpint'}.json`)
const correctFile = path.join(baseDir, `${isMobile ? 'mobile_fingerprint' : 'desktop_fingerprint'}.json`)
let fingerprint!: BrowserFingerprintWithHeaders
if (((saveFingerprint.desktop && !isMobile) || (saveFingerprint.mobile && isMobile)) && fs.existsSync(fingerprintFile)) {
const fingerprintData = await fs.promises.readFile(fingerprintFile, 'utf-8')
fingerprint = JSON.parse(fingerprintData)
const shouldLoad = (saveFingerprint.desktop && !isMobile) || (saveFingerprint.mobile && isMobile)
if (shouldLoad) {
const chosen = fs.existsSync(correctFile) ? correctFile : (fs.existsSync(legacyFile) ? legacyFile : '')
if (chosen) {
const fingerprintData = await fs.promises.readFile(chosen, 'utf-8')
fingerprint = JSON.parse(fingerprintData)
}
}
return {
@@ -96,7 +338,7 @@ export async function saveSessionData(sessionPath: string, browser: BrowserConte
}
}
export async function saveFingerprintData(sessionPath: string, email: string, isMobile: boolean, fingerpint: BrowserFingerprintWithHeaders): Promise<string> {
export async function saveFingerprintData(sessionPath: string, email: string, isMobile: boolean, fingerprint: BrowserFingerprintWithHeaders): Promise<string> {
try {
// Fetch path
const sessionDir = path.join(__dirname, '../browser/', sessionPath, email)
@@ -106,8 +348,12 @@ export async function saveFingerprintData(sessionPath: string, email: string, is
await fs.promises.mkdir(sessionDir, { recursive: true })
}
// Save fingerprint to a file
await fs.promises.writeFile(path.join(sessionDir, `${isMobile ? 'mobile_fingerpint' : 'desktop_fingerpint'}.json`), JSON.stringify(fingerpint))
// Save fingerprint to files (write both legacy and corrected names for compatibility)
const legacy = path.join(sessionDir, `${isMobile ? 'mobile_fingerpint' : 'desktop_fingerpint'}.json`)
const correct = path.join(sessionDir, `${isMobile ? 'mobile_fingerprint' : 'desktop_fingerprint'}.json`)
const payload = JSON.stringify(fingerprint)
await fs.promises.writeFile(correct, payload)
try { await fs.promises.writeFile(legacy, payload) } catch { /* ignore */ }
return sessionDir
} catch (error) {
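For reference, a hedged sketch of the two account layouts loadAccounts now accepts; values are placeholders, and the loader itself only validates that email and password are strings:

// Shape 1: root array (classic accounts.json)
const asRootArray = [
    { email: 'user@example.com', password: 'secret' }
]
// Shape 2: wrapper object; the `_note` field is ignored by the loader
const asWrappedObject = {
    _note: 'ignored by the loader',
    accounts: asRootArray
}
// Either shape can also be supplied via ACCOUNTS_JSON (inline) or ACCOUNTS_FILE (path)
// instead of an accounts.json on disk; // and /* */ comments are stripped before parsing.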


@@ -1,45 +1,92 @@
import chalk from 'chalk'
import { Webhook } from './Webhook'
import { Ntfy } from './Ntfy'
import { loadConfig } from './Load'
export function log(isMobile: boolean | 'main', title: string, message: string, type: 'log' | 'warn' | 'error' = 'log', color?: keyof typeof chalk): void {
// Synchronous logger that returns an Error when type === 'error' so callers can `throw log(...)` safely.
export function log(isMobile: boolean | 'main', title: string, message: string, type: 'log' | 'warn' | 'error' = 'log', color?: keyof typeof chalk): Error | void {
const configData = loadConfig()
if (configData.logExcludeFunc.some(x => x.toLowerCase() === title.toLowerCase())) {
// Access logging config with fallback for backward compatibility
const configAny = configData as unknown as Record<string, unknown>
const loggingConfig = configAny.logging || configData
const loggingConfigAny = loggingConfig as unknown as Record<string, unknown>
const logExcludeFunc = Array.isArray(loggingConfigAny.excludeFunc) ? loggingConfigAny.excludeFunc :
Array.isArray(loggingConfigAny.logExcludeFunc) ? loggingConfigAny.logExcludeFunc : []
if (Array.isArray(logExcludeFunc) && logExcludeFunc.some((x: string) => x.toLowerCase() === title.toLowerCase())) {
return
}
const currentTime = new Date().toLocaleString()
const platformText = isMobile === 'main' ? 'MAIN' : isMobile ? 'MOBILE' : 'DESKTOP'
const chalkedPlatform = isMobile === 'main' ? chalk.bgCyan('MAIN') : isMobile ? chalk.bgBlue('MOBILE') : chalk.bgMagenta('DESKTOP')
// Clean string for notifications (no chalk, structured)
type LoggingCfg = { excludeFunc?: string[]; webhookExcludeFunc?: string[]; redactEmails?: boolean }
const loggingCfg: LoggingCfg = (configAny.logging || {}) as LoggingCfg
const shouldRedact = !!loggingCfg.redactEmails
const redact = (s: string) => shouldRedact ? s.replace(/[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,}/ig, (m) => {
const [u, d] = m.split('@'); return `${(u||'').slice(0,2)}***@${d||''}`
}) : s
const cleanStr = redact(`[${currentTime}] [PID: ${process.pid}] [${type.toUpperCase()}] ${platformText} [${title}] ${message}`)
// Clean string for the Webhook (no chalk)
const cleanStr = `[${currentTime}] [PID: ${process.pid}] [${type.toUpperCase()}] ${platformText} [${title}] ${message}`
// Send the clean string to the Webhook
if (!configData.webhookLogExcludeFunc.some(x => x.toLowerCase() === title.toLowerCase())) {
Webhook(configData, cleanStr)
// Define conditions for sending to NTFY
const ntfyConditions = {
log: [
message.toLowerCase().includes('started tasks for account'),
message.toLowerCase().includes('press the number'),
message.toLowerCase().includes('no points to earn')
],
error: [],
warn: [
message.toLowerCase().includes('aborting'),
message.toLowerCase().includes('didn\'t gain')
]
}
// Formatted string with chalk for terminal logging
const str = `[${currentTime}] [PID: ${process.pid}] [${type.toUpperCase()}] ${chalkedPlatform} [${title}] ${message}`
// Check if the current log type and message meet the NTFY conditions
try {
if (type in ntfyConditions && ntfyConditions[type as keyof typeof ntfyConditions].some(condition => condition)) {
// Fire-and-forget
Promise.resolve(Ntfy(cleanStr, type)).catch(() => { /* ignore ntfy errors */ })
}
} catch { /* ignore */ }
// Console output with better formatting
const typeIndicator = type === 'error' ? '✗' : type === 'warn' ? '⚠' : '●'
const platformColor = isMobile === 'main' ? chalk.cyan : isMobile ? chalk.blue : chalk.magenta
const typeColor = type === 'error' ? chalk.red : type === 'warn' ? chalk.yellow : chalk.green
const formattedStr = [
chalk.gray(`[${currentTime}]`),
chalk.gray(`[${process.pid}]`),
typeColor(`${typeIndicator} ${type.toUpperCase()}`),
platformColor(`[${platformText}]`),
chalk.bold(`[${title}]`),
redact(message)
].join(' ')
const applyChalk = color && typeof chalk[color] === 'function' ? chalk[color] as (msg: string) => string : null
// Log based on the type
switch (type) {
case 'warn':
applyChalk ? console.warn(applyChalk(str)) : console.warn(str)
applyChalk ? console.warn(applyChalk(formattedStr)) : console.warn(formattedStr)
break
case 'error':
applyChalk ? console.error(applyChalk(str)) : console.error(str)
applyChalk ? console.error(applyChalk(formattedStr)) : console.error(formattedStr)
break
default:
applyChalk ? console.log(applyChalk(str)) : console.log(str)
applyChalk ? console.log(applyChalk(formattedStr)) : console.log(formattedStr)
break
}
}
// Return an Error when logging an error so callers can `throw log(...)`
if (type === 'error') {
// CommunityReporter disabled per project policy
return new Error(cleanStr)
}
}
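Because the 'error' branch now returns an Error, callers can log and raise in one statement; a hedged sketch:

import { log } from './Logger'

function requireValue(value: string | undefined): string {
    if (!value) {
        // Logs to console/webhook as usual, then throws the same message
        throw log('main', 'CONFIG', 'Required value missing', 'error')
    }
    return value
}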

src/util/Ntfy.ts (new file, 33 lines)

@@ -0,0 +1,33 @@
import { loadConfig } from './Load'
import axios from 'axios'
const NOTIFICATION_TYPES = {
error: { priority: 'max', tags: 'rotating_light' }, // Customize the ERROR icon here, see: https://docs.ntfy.sh/emojis/
warn: { priority: 'high', tags: 'warning' }, // Customize the WARN icon here, see: https://docs.ntfy.sh/emojis/
log: { priority: 'default', tags: 'medal_sports' } // Customize the LOG icon here, see: https://docs.ntfy.sh/emojis/
}
export async function Ntfy(message: string, type: keyof typeof NOTIFICATION_TYPES = 'log'): Promise<void> {
const config = loadConfig().ntfy
if (!config?.enabled || !config.url || !config.topic) return
try {
const { priority, tags } = NOTIFICATION_TYPES[type]
const headers = {
Title: 'Microsoft Rewards Script',
Priority: priority,
Tags: tags,
...(config.authToken && { Authorization: `Bearer ${config.authToken}` })
}
const response = await axios.post(`${config.url}/${config.topic}`, message, { headers })
if (response.status === 200) {
console.log('NTFY notification successfully sent.')
} else {
console.error(`NTFY notification failed with status ${response.status}`)
}
} catch (error) {
console.error('Failed to send NTFY notification:', error)
}
}

src/util/Retry.ts (new file, 63 lines)

@@ -0,0 +1,63 @@
import type { ConfigRetryPolicy } from '../interface/Config'
import Util from './Utils'
type NumericPolicy = {
maxAttempts: number
baseDelay: number
maxDelay: number
multiplier: number
jitter: number
}
export type Retryable<T> = () => Promise<T>
export class Retry {
private policy: NumericPolicy
constructor(policy?: ConfigRetryPolicy) {
const def: NumericPolicy = {
maxAttempts: 3,
baseDelay: 1000,
maxDelay: 30000,
multiplier: 2,
jitter: 0.2
}
const merged: ConfigRetryPolicy = { ...(policy || {}) }
// normalize string durations
const util = new Util()
const parse = (v: number | string) => {
if (typeof v === 'number') return v
try { return util.stringToMs(String(v)) } catch { return def.baseDelay }
}
this.policy = {
maxAttempts: (merged.maxAttempts as number) ?? def.maxAttempts,
baseDelay: parse(merged.baseDelay ?? def.baseDelay),
maxDelay: parse(merged.maxDelay ?? def.maxDelay),
multiplier: (merged.multiplier as number) ?? def.multiplier,
jitter: (merged.jitter as number) ?? def.jitter
}
}
async run<T>(fn: Retryable<T>, isRetryable?: (e: unknown) => boolean): Promise<T> {
let attempt = 0
let delay = this.policy.baseDelay
let lastErr: unknown
while (attempt < this.policy.maxAttempts) {
try {
return await fn()
} catch (e) {
lastErr = e
attempt += 1
const retry = isRetryable ? isRetryable(e) : true
if (!retry || attempt >= this.policy.maxAttempts) break
const jitter = 1 + (Math.random() * 2 - 1) * this.policy.jitter
const sleep = Math.min(this.policy.maxDelay, Math.max(0, Math.floor(delay * jitter)))
await new Promise((r) => setTimeout(r, sleep))
delay = Math.min(this.policy.maxDelay, Math.floor(delay * (this.policy.multiplier || 2)))
}
}
throw lastErr instanceof Error ? lastErr : new Error(String(lastErr))
}
}
export default Retry
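A hedged usage sketch wrapping a flaky call with the configured policy:

import Retry from './Retry'
import { loadConfig } from './Load'

const retry = new Retry(loadConfig().retryPolicy)

async function fetchWithRetry<T>(fetchOnce: () => Promise<T>): Promise<T> {
    return retry.run(fetchOnce, (e) => {
        // Retry only errors that look transient; this predicate is illustrative
        const msg = e instanceof Error ? e.message : String(e)
        return /timeout|ECONNRESET|ETIMEDOUT|429/i.test(msg)
    })
}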

src/util/Totp.ts (new file, 84 lines)

@@ -0,0 +1,84 @@
import crypto from 'crypto'
/**
* Decode Base32 (RFC 4648) to a Buffer.
* Accepts lowercase/uppercase, optional padding.
*/
function base32Decode(input: string): Buffer {
const alphabet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ234567'
const clean = input.toUpperCase().replace(/=+$/g, '').replace(/[^A-Z2-7]/g, '')
let bits = 0
let value = 0
const bytes: number[] = []
for (const char of clean) {
const idx = alphabet.indexOf(char)
if (idx < 0) continue
value = (value << 5) | idx
bits += 5
if (bits >= 8) {
bits -= 8
bytes.push((value >>> bits) & 0xff)
}
}
return Buffer.from(bytes)
}
/**
* Generate an HMAC using Node's crypto and return Buffer.
*/
function hmac(algorithm: string, key: Buffer, data: Buffer): Buffer {
return crypto.createHmac(algorithm, key).update(data).digest()
}
export type TotpOptions = { digits?: number; step?: number; algorithm?: 'SHA1' | 'SHA256' | 'SHA512' }
/**
* Generate TOTP per RFC 6238.
* @param secretBase32 - shared secret in Base32
* @param time - Unix time in seconds (defaults to now)
* @param options - { digits, step, algorithm }
* @returns numeric TOTP as string (zero-padded)
*/
export function generateTOTP(
secretBase32: string,
time: number = Math.floor(Date.now() / 1000),
options?: TotpOptions
): string {
const digits = options?.digits ?? 6
const step = options?.step ?? 30
const alg = (options?.algorithm ?? 'SHA1').toUpperCase()
const key = base32Decode(secretBase32)
const counter = Math.floor(time / step)
// 8-byte big-endian counter
const counterBuffer = Buffer.alloc(8)
counterBuffer.writeBigUInt64BE(BigInt(counter), 0)
let hmacAlg: string
if (alg === 'SHA1') hmacAlg = 'sha1'
else if (alg === 'SHA256') hmacAlg = 'sha256'
else if (alg === 'SHA512') hmacAlg = 'sha512'
else throw new Error('Unsupported algorithm. Use SHA1, SHA256 or SHA512.')
const hash = hmac(hmacAlg, key, counterBuffer)
if (!hash || hash.length < 20) {
// Minimal sanity check; for SHA1 length is 20
throw new Error('Invalid HMAC output for TOTP')
}
// Dynamic truncation
const offset = hash[hash.length - 1]! & 0x0f
if (offset + 3 >= hash.length) {
throw new Error('Invalid dynamic truncation offset')
}
const code =
((hash[offset]! & 0x7f) << 24) |
((hash[offset + 1]! & 0xff) << 16) |
((hash[offset + 2]! & 0xff) << 8) |
(hash[offset + 3]! & 0xff)
const otp = (code % 10 ** digits).toString().padStart(digits, '0')
return otp
}
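A hedged usage sketch; the secret shown is a commonly used placeholder value, not a real credential:

import { generateTOTP } from './Totp'

// Defaults match common authenticator apps: 6 digits, 30-second step, SHA1
function currentTotpCode(secretBase32: string): string {
    return generateTOTP(secretBase32)
}

// With explicit options:
// generateTOTP('JBSWY3DPEHPK3PXP', Math.floor(Date.now() / 1000), { digits: 6, step: 30, algorithm: 'SHA1' })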


@@ -8,6 +8,11 @@ export default class Util {
})
}
async waitRandom(minMs: number, maxMs: number): Promise<void> {
const delta = this.randomNumber(minMs, maxMs)
return this.wait(delta)
}
getFormattedDate(ms = Date.now()): string {
const today = new Date(ms)
const month = String(today.getMonth() + 1).padStart(2, '0') // January is 0


@@ -1,23 +0,0 @@
import axios from 'axios'
import { Config } from '../interface/Config'
export async function Webhook(configData: Config, content: string) {
const webhook = configData.webhook
if (!webhook.enabled || webhook.url.length < 10) return
const request = {
method: 'POST',
url: webhook.url,
headers: {
'Content-Type': 'application/json'
},
data: {
'content': content
}
}
await axios(request).catch(() => { })
}