initial commit

Илья Глазунов 2026-01-14 22:57:19 +03:00
commit beb9034cd4
50 changed files with 17257 additions and 0 deletions

.gitignore

@@ -0,0 +1,95 @@
# Logs
src/data
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
.DS_Store
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# TypeScript v1 declaration files
typings/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# next.js build output
.next
# nuxt.js build output
.nuxt
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# Webpack
.webpack/
# Vite
.vite/
# Electron-Forge
out/
.specstory
.specstory/

.npmrc

@@ -0,0 +1 @@
node-linker=hoisted

.prettierignore

@@ -0,0 +1,2 @@
src/assets
node_modules

.prettierrc

@@ -0,0 +1,10 @@
{
"semi": true,
"tabWidth": 4,
"printWidth": 150,
"singleQuote": true,
"trailingComma": "es5",
"bracketSpacing": true,
"arrowParens": "avoid",
"endOfLine": "lf"
}

AGENTS.md

@@ -0,0 +1,130 @@
# Repo Guidelines
This repository is a fork of [`cheating-daddy`](https://github.com/sohzm/cheating-daddy).
It provides an Electron-based real-time assistant that captures screen and audio
for contextual AI responses. The code is JavaScript and uses Electron Forge for
packaging.
## Getting started
Install dependencies and run the development app:
```
npm install
npm start
```
## Style
Run `npx prettier --write .` before committing. Prettier uses the settings in
`.prettierrc` (four-space indentation, print width 150, semicolons and single
quotes). `src/assets` and `node_modules` are ignored via `.prettierignore`.
The project does not provide linting; `npm run lint` simply prints
"No linting configured".
## Code standards
Development is gradually migrating toward a TypeScript/React codebase inspired by the
[transcriber](https://github.com/Gatecrashah/transcriber) project. Keep the following
rules in mind as new files are created:
- **TypeScript strict mode**: avoid `any` and prefer explicit interfaces.
- **React components**: write functional components with hooks and wrap them in
  error boundaries where appropriate.
- **Secure IPC**: validate and sanitize all parameters crossing the renderer/main
  boundary.
- **Non-blocking audio**: heavy processing must stay off the UI thread.
- **Tests**: every new feature requires tests once the test suite is available.
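
The secure-IPC rule above can be sketched as a small validator that runs in the main process before any handler logic touches a renderer-supplied value. The payload shape and channel name below are hypothetical, not part of the current codebase:

```typescript
// Hypothetical payload for a transcription request crossing the IPC boundary.
interface TranscriptionRequest {
    sourceId: string;
    sampleRate: number;
}

// Validate and sanitize an untrusted value from the renderer. Returns a
// cleaned copy or null; never trust the renderer's object shape directly.
function parseTranscriptionRequest(raw: unknown): TranscriptionRequest | null {
    if (typeof raw !== 'object' || raw === null) return null;
    const { sourceId, sampleRate } = raw as Record<string, unknown>;
    if (typeof sourceId !== 'string' || !/^[\w:-]{1,128}$/.test(sourceId)) return null;
    if (typeof sampleRate !== 'number' || ![16000, 44100, 48000].includes(sampleRate)) return null;
    return { sourceId, sampleRate };
}

// In the main process the validator would wrap the handler (sketch):
// ipcMain.handle('transcribe', (_event, raw) => {
//     const req = parseTranscriptionRequest(raw);
//     if (!req) throw new Error('invalid transcribe payload');
//     return startTranscription(req);
// });
```

Rejecting malformed input at the boundary keeps every handler free to assume a well-typed payload.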
## Shadcn and Electron
The interface is being rebuilt with [shadcn/ui](https://ui.shadcn.com) components.
Follow these guidelines when working on UI code:
- **Component directory**: place generated files under `src/components/ui` and export them from that folder.
- **Add components with the CLI**: run `npx shadcn@latest add <component>`; never hand-roll components.
- **Component pattern**: use `React.forwardRef` with the `cn()` helper for class names.
- **Path aliases**: import modules from `src` using the `@/` prefix.
- **React 19 + Compiler**: target React 19 with the new compiler when available.
- **Context isolation**: maintain Electron's context isolation pattern for IPC.
- **TypeScript strict mode**: run `npm run typecheck` before claiming work complete.
- **Tailwind theming**: rely on CSS variables and utilities in `@/utils/tailwind` for styling.
- **Testing without running**: confirm `npm run typecheck` passes and check module resolution with `node -e "require('<file>')"`.
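
The `cn()` helper named above is, in shadcn projects, usually built from `clsx` plus `tailwind-merge`; as a dependency-free approximation, the call shape looks like this (the real helper additionally de-duplicates conflicting Tailwind classes):

```typescript
// Minimal stand-in for shadcn's cn() helper: joins truthy class values.
// The real implementation typically wraps clsx + tailwind-merge.
type ClassValue = string | false | null | undefined;

function cn(...values: ClassValue[]): string {
    return values.filter(Boolean).join(' ');
}

// Typical forwardRef component pattern (sketch, assuming React is set up):
// const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
//     ({ className, ...props }, ref) => (
//         <button ref={ref} className={cn('inline-flex items-center', className)} {...props} />
//     )
// );
```

Passing `className` through `cn()` lets callers extend or override the component's base classes.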
## Tests
No automated tests yet. When a suite is added, run `npm test` before each
commit. Until then, at minimum ensure `npm install` and `npm start` work after
merging upstream changes.
## Merging upstream PRs
Pull requests from <https://github.com/sohzm/cheating-daddy> are commonly
cherry-picked here. When merging:
1. Inspect the diff and keep commit messages short (`feat:` / `fix:` etc.).
2. After merging, run the application locally to verify it still builds and
functions.
## Strategy and Future Work
We plan to extend this project with ideas from the
[`transcriber`](https://github.com/Gatecrashah/transcriber) project which also
uses Electron. Key goals are:
- **Local Transcription**: integrate `whisper.cpp` to allow offline speech-to-text.
  Investigate the architecture used in `transcriber/src/main` for model
  validation and GPU acceleration.
- **Dual Audio Capture**: capture microphone and system audio simultaneously.
  `transcriber` shows one approach using a native helper for macOS and
  Electron's `getDisplayMedia` for other platforms.
- **Speaker Diarization**: explore tinydiarize for identifying speakers in mono
  audio streams.
- **Voice Activity Detection**: skip silent or low-quality segments before
  sending to the AI service.
- **Improved Note Handling**: store transcriptions locally and associate them
  with meeting notes, similar to `transcriber`'s note management system.
- **Testing Infrastructure**: adopt Jest and React Testing Library (if React is
  introduced) to cover audio capture and transcription modules.
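
The local-transcription goal implies a resampling step: `whisper.cpp` expects 16 kHz mono input, while captured audio is typically 44.1 or 48 kHz. A minimal linear-interpolation downsampler sketch (production code should low-pass filter first to avoid aliasing):

```typescript
// Resample mono PCM (Float32) to a target rate via linear interpolation.
// Assumption: a proper pipeline would apply an anti-aliasing filter before
// decimating; this sketch only shows the rate conversion itself.
function resampleLinear(input: Float32Array, fromRate: number, toRate: number): Float32Array {
    if (fromRate === toRate) return input;
    const outLength = Math.round((input.length * toRate) / fromRate);
    const output = new Float32Array(outLength);
    const ratio = fromRate / toRate;
    for (let i = 0; i < outLength; i++) {
        const pos = i * ratio;
        const left = Math.floor(pos);
        const right = Math.min(left + 1, input.length - 1);
        const frac = pos - left;
        // Blend the two nearest source samples.
        output[i] = input[left] * (1 - frac) + input[right] * frac;
    }
    return output;
}
```

For example, one second of 48 kHz capture (48,000 samples) becomes 16,000 samples ready for the model.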
### TODO
1. Research and prototype local transcription using `whisper.cpp`.
2. Add dual-stream audio capture logic for cross-platform support.
3. Investigate speaker diarization options and integrate when feasible.
4. Plan a migration path toward a proper testing setup (Jest or similar).
5. Document security considerations for audio storage and processing.
6. Rebuild the entire UI using shadcn components.
These plans are aspirational; implement them gradually while keeping the app
functional.
## Audio processing principles
When implementing transcription features borrow the following rules from
`transcriber`:
- **16 kHz compatibility**: resample all audio before sending to whisper.cpp.
- **Dual-stream architecture**: capture microphone and system audio on separate
  channels.
- **Speaker diarization**: integrate tinydiarize (`--tinydiarize` flag) for mono
  audio and parse `[SPEAKER_TURN]` markers to label speakers (Speaker A, B, C…).
- **Voice activity detection**: pre-filter silent segments to improve speed.
- **Quality preservation**: keep sample fidelity and avoid blocking the UI
  during heavy processing.
- **Memory efficiency**: stream large audio files instead of loading them all at
  once.
- **Error recovery**: handle audio device failures gracefully.
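
The `[SPEAKER_TURN]` handling above can be sketched as a small parser that splits tinydiarize output into labeled segments. The marker string matches tinydiarize's output; the alternating "Speaker A, B, C…" labeling is this document's convention, and the marker only signals a *change* of speaker, so re-identifying a returning speaker would need real diarization:

```typescript
interface SpeakerSegment {
    speaker: string; // 'Speaker A', 'Speaker B', ...
    text: string;
}

// Split a tinydiarize transcript on [SPEAKER_TURN] markers and assign a
// fresh letter to each turn (wrapping after Z).
function labelSpeakerTurns(transcript: string): SpeakerSegment[] {
    return transcript
        .split('[SPEAKER_TURN]')
        .map(part => part.trim())
        .filter(part => part.length > 0)
        .map((text, i) => ({
            speaker: `Speaker ${String.fromCharCode(65 + (i % 26))}`,
            text,
        }));
}
```

A transcript like `"hello [SPEAKER_TURN] hi there"` yields two segments labeled Speaker A and Speaker B.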
## Privacy by design
- **Local processing**: transcriptions should happen locally whenever possible.
- **User control**: provide clear options for data retention and deletion.
- **Transparency**: document what is stored and where.
- **Minimal data**: only persist what is required for functionality.
## LLM plans
There are placeholder files for future LLM integration (e.g. Qwen models via
`llama.cpp`). Continue development after the core transcription pipeline is
stable and ensure tests cover this new functionality.

LICENSE

@@ -0,0 +1,674 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.
60
README.md Normal file
@ -0,0 +1,60 @@
<img width="1299" height="424" alt="cd (1)" src="https://github.com/user-attachments/assets/b25fff4d-043d-4f38-9985-f832ae0d0f6e" />
## Recall.ai - API for desktop recording
If you're looking for a hosted desktop recording API, consider checking out [Recall.ai](https://www.recall.ai/product/desktop-recording-sdk/?utm_source=github&utm_medium=sponsorship&utm_campaign=sohzm-cheating-daddy), an API that records Zoom, Google Meet, Microsoft Teams, in-person meetings, and more.
This project is sponsored by Recall.ai.
---
> [!NOTE]
> Use the latest macOS and Windows versions; older versions have limited support
> [!NOTE]
> During testing it won't answer if you ask it something directly; you need to simulate an interviewer asking a question, which it will then answer
A real-time AI assistant that provides contextual help during video calls, interviews, presentations, and meetings using screen capture and audio analysis.
## Features
- **Live AI Assistance**: Real-time help powered by Google Gemini 2.0 Flash Live
- **Screen & Audio Capture**: Analyzes what you see and hear for contextual responses
- **Multiple Profiles**: Interview, Sales Call, Business Meeting, Presentation, Negotiation
- **Transparent Overlay**: Always-on-top window that can be positioned anywhere
- **Click-through Mode**: Make window transparent to clicks when needed
- **Cross-platform**: Works on macOS, Windows, and Linux (Linux support is experimental and intended for testing only right now)
## Setup
1. **Get a Gemini API Key**: Visit [Google AI Studio](https://aistudio.google.com/apikey)
2. **Install Dependencies**: `npm install`
3. **Run the App**: `npm start`
## Usage
1. Enter your Gemini API key in the main window
2. Choose your profile and language in settings
3. Click "Start Session" to begin
4. Position the window using keyboard shortcuts
5. The AI will provide real-time assistance based on your screen and what the interviewer asks
## Keyboard Shortcuts
- **Window Movement**: `Ctrl/Cmd + Arrow Keys` - Move window
- **Click-through**: `Ctrl/Cmd + M` - Toggle mouse events
- **Close/Back**: `Ctrl/Cmd + \` - Close window or go back
- **Send Message**: `Enter` - Send text to AI
## Audio Capture
- **macOS**: [SystemAudioDump](https://github.com/Mohammed-Yasin-Mulla/Sound) for system audio
- **Windows**: Loopback audio capture
- **Linux**: Microphone input
## Requirements
- Electron-compatible OS (macOS, Windows, Linux)
- Gemini API key
- Screen recording permissions
- Microphone/audio permissions
22
entitlements.plist Normal file
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.debugger</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.device.microphone</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.network.server</key>
<true/>
</dict>
</plist>
76
forge.config.js Normal file
@ -0,0 +1,76 @@
const { FusesPlugin } = require('@electron-forge/plugin-fuses');
const { FuseV1Options, FuseVersion } = require('@electron/fuses');
module.exports = {
packagerConfig: {
asar: true,
extraResource: ['./src/assets/SystemAudioDump'],
name: 'Cheating Daddy',
icon: 'src/assets/logo',
        // use `security find-identity -v -p codesigning` to find your identity
        // for macOS code signing
// osxSign: {
// identity: '<paste your identity here>',
// optionsForFile: (filePath) => {
// return {
// entitlements: 'entitlements.plist',
// };
// },
// },
        // notarization is off because it ran for 6 hours and still didn't finish
// osxNotarize: {
// appleId: 'your apple id',
// appleIdPassword: 'app specific password',
// teamId: 'your team id',
// },
},
rebuildConfig: {},
makers: [
{
name: '@electron-forge/maker-squirrel',
config: {
name: 'cheating-daddy',
productName: 'Cheating Daddy',
shortcutName: 'Cheating Daddy',
createDesktopShortcut: true,
createStartMenuShortcut: true,
},
},
{
name: '@electron-forge/maker-dmg',
platforms: ['darwin'],
},
{
name: '@reforged/maker-appimage',
platforms: ['linux'],
config: {
options: {
name: 'Cheating Daddy',
productName: 'Cheating Daddy',
genericName: 'AI Assistant',
description: 'AI assistant for interviews and learning',
categories: ['Development', 'Education'],
icon: 'src/assets/logo.png'
}
},
},
],
plugins: [
{
name: '@electron-forge/plugin-auto-unpack-natives',
config: {},
},
// Fuses are used to enable/disable various Electron functionality
// at package time, before code signing the application
new FusesPlugin({
version: FuseVersion.V1,
[FuseV1Options.RunAsNode]: false,
[FuseV1Options.EnableCookieEncryption]: true,
[FuseV1Options.EnableNodeOptionsEnvironmentVariable]: false,
[FuseV1Options.EnableNodeCliInspectArguments]: false,
[FuseV1Options.EnableEmbeddedAsarIntegrityValidation]: true,
[FuseV1Options.OnlyLoadAppFromAsar]: true,
}),
],
};
45
package.json Normal file
@ -0,0 +1,45 @@
{
"name": "cheating-daddy",
"productName": "cheating-daddy",
"version": "0.5.0",
"description": "cheating daddy",
"main": "src/index.js",
"scripts": {
"start": "electron-forge start",
"package": "electron-forge package",
"make": "electron-forge make",
"publish": "electron-forge publish",
"lint": "echo \"No linting configured\""
},
"keywords": [
"cheating daddy",
"cheating daddy ai",
"cheating daddy ai assistant",
"cheating daddy ai assistant for interviews",
"cheating daddy ai assistant for interviews"
],
"author": {
"name": "sohzm",
"email": "sohambharambe9@gmail.com"
},
"license": "GPL-3.0",
"dependencies": {
"@google/genai": "^1.35.0",
"electron-squirrel-startup": "^1.0.1",
"openai": "^6.16.0",
"ws": "^8.18.0"
},
"devDependencies": {
"@electron-forge/cli": "^7.11.1",
"@electron-forge/maker-deb": "^7.11.1",
"@electron-forge/maker-dmg": "^7.11.1",
"@electron-forge/maker-rpm": "^7.11.1",
"@electron-forge/maker-squirrel": "^7.11.1",
"@electron-forge/maker-zip": "^7.11.1",
"@electron-forge/plugin-auto-unpack-natives": "^7.11.1",
"@electron-forge/plugin-fuses": "^7.11.1",
"@electron/fuses": "^2.0.0",
"@reforged/maker-appimage": "^5.1.1",
"electron": "^39.2.7"
}
}
4717
pnpm-lock.yaml generated Normal file
File diff suppressed because it is too large
BIN
src/assets/SystemAudioDump Executable file
Binary file not shown.
1213
src/assets/highlight-11.9.0.min.js vendored Normal file
File diff suppressed because one or more lines are too long
@ -0,0 +1,10 @@
pre code.hljs{display:block;overflow-x:auto;padding:1em}code.hljs{padding:3px 5px}/*!
Theme: GitHub Dark
Description: Dark theme as seen on github.com
Author: github.com
Maintainer: @Hirse
Updated: 2021-05-15
Outdated base version: https://github.com/primer/github-syntax-dark
Current colors taken from GitHub's CSS
*/.hljs{color:#c9d1d9;background:#0d1117}.hljs-doctag,.hljs-keyword,.hljs-meta .hljs-keyword,.hljs-template-tag,.hljs-template-variable,.hljs-type,.hljs-variable.language_{color:#ff7b72}.hljs-title,.hljs-title.class_,.hljs-title.class_.inherited__,.hljs-title.function_{color:#d2a8ff}.hljs-attr,.hljs-attribute,.hljs-literal,.hljs-meta,.hljs-number,.hljs-operator,.hljs-selector-attr,.hljs-selector-class,.hljs-selector-id,.hljs-variable{color:#79c0ff}.hljs-meta .hljs-string,.hljs-regexp,.hljs-string{color:#a5d6ff}.hljs-built_in,.hljs-symbol{color:#ffa657}.hljs-code,.hljs-comment,.hljs-formula{color:#8b949e}.hljs-name,.hljs-quote,.hljs-selector-pseudo,.hljs-selector-tag{color:#7ee787}.hljs-subst{color:#c9d1d9}.hljs-section{color:#1f6feb;font-weight:700}.hljs-bullet{color:#f2cc60}.hljs-emphasis{color:#c9d1d9;font-style:italic}.hljs-strong{color:#c9d1d9;font-weight:700}.hljs-addition{color:#aff5b4;background-color:#033a16}.hljs-deletion{color:#ffdcd7;background-color:#67060c}
126
src/assets/lit-all-2.7.4.min.js vendored Normal file
File diff suppressed because one or more lines are too long
27
src/assets/lit-core-2.7.4.min.js vendored Normal file
File diff suppressed because one or more lines are too long
BIN
src/assets/logo.icns Normal file
Binary file not shown.
BIN
src/assets/logo.ico Normal file
Binary file not shown. (176 KiB)
BIN
src/assets/logo.png Normal file
Binary file not shown. (28 KiB)
6
src/assets/marked-4.3.0.min.js vendored Normal file
File diff suppressed because one or more lines are too long
Binary file not shown.
BIN
src/assets/old/0.3/logo.ico Normal file
Binary file not shown. (173 KiB)
BIN
src/assets/old/0.3/logo.png Normal file
Binary file not shown. (16 KiB)
@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg width="64px" height="64px" viewBox="0 0 24 24" stroke-width="1" fill="none" xmlns="http://www.w3.org/2000/svg" color="#ffffff"><path d="M14.3632 5.65156L15.8431 4.17157C16.6242 3.39052 17.8905 3.39052 18.6716 4.17157L20.0858 5.58579C20.8668 6.36683 20.8668 7.63316 20.0858 8.41421L18.6058 9.8942M14.3632 5.65156L4.74749 15.2672C4.41542 15.5993 4.21079 16.0376 4.16947 16.5054L3.92738 19.2459C3.87261 19.8659 4.39148 20.3848 5.0115 20.33L7.75191 20.0879C8.21972 20.0466 8.65806 19.8419 8.99013 19.5099L18.6058 9.8942M14.3632 5.65156L18.6058 9.8942" stroke="#ffffff" stroke-width="1" stroke-linecap="round" stroke-linejoin="round"></path></svg>
@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg width="64px" height="64px" stroke-width="1" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg" color="#ffffff"><path d="M12 15C13.6569 15 15 13.6569 15 12C15 10.3431 13.6569 9 12 9C10.3431 9 9 10.3431 9 12C9 13.6569 10.3431 15 12 15Z" stroke="#ffffff" stroke-width="1" stroke-linecap="round" stroke-linejoin="round"></path><path d="M19.6224 10.3954L18.5247 7.7448L20 6L18 4L16.2647 5.48295L13.5578 4.36974L12.9353 2H10.981L10.3491 4.40113L7.70441 5.51596L6 4L4 6L5.45337 7.78885L4.3725 10.4463L2 11V13L4.40111 13.6555L5.51575 16.2997L4 18L6 20L7.79116 18.5403L10.397 19.6123L11 22H13L13.6045 19.6132L16.2551 18.5155C16.6969 18.8313 18 20 18 20L20 18L18.5159 16.2494L19.6139 13.598L21.9999 12.9772L22 11L19.6224 10.3954Z" stroke="#ffffff" stroke-width="1" stroke-linecap="round" stroke-linejoin="round"></path></svg>
@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg width="64px" height="64px" stroke-width="1" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg" color="#ffffff"><path d="M22 8.86222C22 10.4087 21.4062 11.8941 20.3458 12.9929C17.9049 15.523 15.5374 18.1613 13.0053 20.5997C12.4249 21.1505 11.5042 21.1304 10.9488 20.5547L3.65376 12.9929C1.44875 10.7072 1.44875 7.01723 3.65376 4.73157C5.88044 2.42345 9.50794 2.42345 11.7346 4.73157L11.9998 5.00642L12.2648 4.73173C13.3324 3.6245 14.7864 3 16.3053 3C17.8242 3 19.2781 3.62444 20.3458 4.73157C21.4063 5.83045 22 7.31577 22 8.86222Z" stroke="#ffffff" stroke-width="1" stroke-linejoin="round"></path></svg>
@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg width="64px" height="64px" stroke-width="1" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg" color="#ffffff"><path d="M8.5 11.5L11.5 14.5L16.5 9.5" stroke="#ffffff" stroke-width="1" stroke-linecap="round" stroke-linejoin="round"></path><path d="M5 18L3.13036 4.91253C3.05646 4.39524 3.39389 3.91247 3.90398 3.79912L11.5661 2.09641C11.8519 2.03291 12.1481 2.03291 12.4339 2.09641L20.096 3.79912C20.6061 3.91247 20.9435 4.39524 20.8696 4.91252L19 18C18.9293 18.495 18.5 21.5 12 21.5C5.5 21.5 5.07071 18.495 5 18Z" stroke="#ffffff" stroke-width="1" stroke-linecap="round" stroke-linejoin="round"></path></svg>
@ -0,0 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg width="64px" height="64px" viewBox="0 0 24 24" stroke-width="1" fill="none" xmlns="http://www.w3.org/2000/svg" color="#ffffff"><path d="M14.1488 9.47163V3.61153C14.1488 2.72151 13.4273 2 12.5373 2V2C11.6473 2 10.9258 2.72151 10.9258 3.61153V8.44611" stroke="#ffffff" stroke-width="1" stroke-linecap="round"></path><path d="M16.346 12.841L18.5217 5.58862C18.7755 4.74265 18.2886 3.85248 17.4394 3.60984V3.60984C16.5943 3.3684 15.7142 3.8609 15.4779 4.70743L14.1484 9.47149" stroke="#ffffff" stroke-width="1" stroke-linecap="round"></path><path d="M7.61935 9.24985L8.67489 11.5913C9.03961 12.4003 8.68159 13.352 7.87404 13.72C7.06183 14.0901 6.10347 13.7296 5.73663 12.9159L4.68109 10.5745C4.31637 9.76542 4.67439 8.81376 5.48193 8.44574C6.29415 8.07559 7.25251 8.43614 7.61935 9.24985Z" stroke="#ffffff" stroke-width="1" stroke-linecap="round"></path><path d="M11.7192 12.2615V12.2615C11.9239 11.694 11.8998 11.0692 11.6518 10.5192L10.5787 8.13874C10.2181 7.33892 9.27613 6.98454 8.4778 7.34836V7.34836C7.66469 7.71892 7.31885 8.68832 7.71382 9.48986L7.84946 9.76511" stroke="#ffffff" stroke-width="1" stroke-linecap="round"></path><path d="M13.8566 17.6767L14.3487 16.6927C14.3976 16.5947 14.3461 16.4763 14.241 16.4454L10.6903 15.4011C9.97853 15.1918 9.51797 14.5038 9.59563 13.766V13.766C9.68372 12.9292 10.4284 12.3188 11.2662 12.3968L16.0542 12.8422C16.0542 12.8422 19.8632 13.4282 18.5447 17.2372C17.2262 21.0463 16.7867 22.3648 13.8566 22.3648C11.9521 22.3648 9.16855 22.3648 9.16855 22.3648H8.87555C6.52912 22.3648 4.62697 20.4627 4.62697 18.1163V18.1163L4.48047 9.91211" stroke="#ffffff" stroke-width="1" stroke-linecap="round"></path></svg>
135
src/audioUtils.js Normal file
@ -0,0 +1,135 @@
const fs = require('fs');
const path = require('path');
// Convert raw PCM to WAV format for easier playback and verification
function pcmToWav(pcmBuffer, outputPath, sampleRate = 24000, channels = 1, bitDepth = 16) {
const byteRate = sampleRate * channels * (bitDepth / 8);
const blockAlign = channels * (bitDepth / 8);
const dataSize = pcmBuffer.length;
// Create WAV header
const header = Buffer.alloc(44);
// "RIFF" chunk descriptor
header.write('RIFF', 0);
header.writeUInt32LE(dataSize + 36, 4); // File size - 8
header.write('WAVE', 8);
// "fmt " sub-chunk
header.write('fmt ', 12);
header.writeUInt32LE(16, 16); // Subchunk1Size (16 for PCM)
header.writeUInt16LE(1, 20); // AudioFormat (1 for PCM)
header.writeUInt16LE(channels, 22); // NumChannels
header.writeUInt32LE(sampleRate, 24); // SampleRate
header.writeUInt32LE(byteRate, 28); // ByteRate
header.writeUInt16LE(blockAlign, 32); // BlockAlign
header.writeUInt16LE(bitDepth, 34); // BitsPerSample
// "data" sub-chunk
header.write('data', 36);
header.writeUInt32LE(dataSize, 40); // Subchunk2Size
// Combine header and PCM data
const wavBuffer = Buffer.concat([header, pcmBuffer]);
// Write to file
fs.writeFileSync(outputPath, wavBuffer);
return outputPath;
}
// Analyze audio buffer for debugging
function analyzeAudioBuffer(buffer, label = 'Audio') {
const int16Array = new Int16Array(buffer.buffer, buffer.byteOffset, buffer.length / 2);
let minValue = 32767;
let maxValue = -32768;
let avgValue = 0;
let rmsValue = 0;
let silentSamples = 0;
for (let i = 0; i < int16Array.length; i++) {
const sample = int16Array[i];
minValue = Math.min(minValue, sample);
maxValue = Math.max(maxValue, sample);
avgValue += sample;
rmsValue += sample * sample;
if (Math.abs(sample) < 100) {
silentSamples++;
}
}
avgValue /= int16Array.length;
rmsValue = Math.sqrt(rmsValue / int16Array.length);
const silencePercentage = (silentSamples / int16Array.length) * 100;
console.log(`${label} Analysis:`);
console.log(` Samples: ${int16Array.length}`);
console.log(` Min: ${minValue}, Max: ${maxValue}`);
console.log(` Average: ${avgValue.toFixed(2)}`);
console.log(` RMS: ${rmsValue.toFixed(2)}`);
console.log(` Silence: ${silencePercentage.toFixed(1)}%`);
    const peak = Math.max(Math.abs(minValue), Math.abs(maxValue)) || 1;
    console.log(`  Dynamic Range: ${(20 * Math.log10(peak / (rmsValue || 1))).toFixed(1)} dB`);
return {
minValue,
maxValue,
avgValue,
rmsValue,
silencePercentage,
sampleCount: int16Array.length,
};
}
// Save audio buffer with metadata for debugging
function saveDebugAudio(buffer, type, timestamp = Date.now()) {
const homeDir = require('os').homedir();
const debugDir = path.join(homeDir, 'cheating-daddy-debug');
if (!fs.existsSync(debugDir)) {
fs.mkdirSync(debugDir, { recursive: true });
}
const pcmPath = path.join(debugDir, `${type}_${timestamp}.pcm`);
const wavPath = path.join(debugDir, `${type}_${timestamp}.wav`);
const metaPath = path.join(debugDir, `${type}_${timestamp}.json`);
// Save raw PCM
fs.writeFileSync(pcmPath, buffer);
// Convert to WAV for easy playback
pcmToWav(buffer, wavPath);
// Analyze and save metadata
const analysis = analyzeAudioBuffer(buffer, type);
fs.writeFileSync(
metaPath,
JSON.stringify(
{
timestamp,
type,
bufferSize: buffer.length,
analysis,
format: {
sampleRate: 24000,
channels: 1,
bitDepth: 16,
},
},
null,
2
)
);
console.log(`Debug audio saved: ${wavPath}`);
return { pcmPath, wavPath, metaPath };
}
module.exports = {
pcmToWav,
analyzeAudioBuffer,
saveDebugAudio,
};
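As a sanity check of the header math above, the following standalone sketch synthesizes one second of 24 kHz mono 16-bit PCM and wraps it in the same 44-byte RIFF/WAVE header that `pcmToWav` builds. The header construction is inlined (rather than requiring the module) so the snippet runs on its own; the 440 Hz tone is purely illustrative.

```javascript
// Synthesize 1 s of 24 kHz mono 16-bit PCM: a quiet 440 Hz sine tone.
const sampleRate = 24000;
const channels = 1;
const bitDepth = 16;
const samples = sampleRate; // one second
const pcm = Buffer.alloc(samples * 2); // 2 bytes per 16-bit sample
for (let i = 0; i < samples; i++) {
    const value = Math.round(0.3 * 32767 * Math.sin((2 * Math.PI * 440 * i) / sampleRate));
    pcm.writeInt16LE(value, i * 2);
}

// Same 44-byte header layout as pcmToWav().
const byteRate = sampleRate * channels * (bitDepth / 8);
const blockAlign = channels * (bitDepth / 8);
const header = Buffer.alloc(44);
header.write('RIFF', 0);
header.writeUInt32LE(pcm.length + 36, 4); // file size minus the 8-byte RIFF preamble
header.write('WAVE', 8);
header.write('fmt ', 12);
header.writeUInt32LE(16, 16); // PCM fmt chunk is 16 bytes
header.writeUInt16LE(1, 20); // audio format 1 = linear PCM
header.writeUInt16LE(channels, 22);
header.writeUInt32LE(sampleRate, 24);
header.writeUInt32LE(byteRate, 28);
header.writeUInt16LE(blockAlign, 32);
header.writeUInt16LE(bitDepth, 34);
header.write('data', 36);
header.writeUInt32LE(pcm.length, 40); // size of the data chunk

const wav = Buffer.concat([header, pcm]);
console.log(wav.length); // 44 + 48000 = 48044
console.log(wav.toString('ascii', 0, 4)); // RIFF
```

Writing `wav` to a `.wav` path, as `pcmToWav` does with `fs.writeFileSync`, yields a file any audio player can open.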
@ -0,0 +1,340 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
export class AppHeader extends LitElement {
static styles = css`
* {
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
cursor: default;
user-select: none;
}
.header {
-webkit-app-region: drag;
display: flex;
align-items: center;
padding: var(--header-padding);
background: var(--header-background);
border-bottom: 1px solid var(--border-color);
}
.header-title {
flex: 1;
font-size: var(--header-font-size);
font-weight: 500;
color: var(--text-color);
-webkit-app-region: drag;
}
.header-actions {
display: flex;
gap: var(--header-gap);
align-items: center;
-webkit-app-region: no-drag;
}
.header-actions span {
font-size: var(--header-font-size-small);
color: var(--text-secondary);
}
.button {
background: transparent;
color: var(--text-color);
border: 1px solid var(--border-color);
padding: var(--header-button-padding);
border-radius: 3px;
font-size: var(--header-font-size-small);
font-weight: 500;
transition: background 0.1s ease;
}
.button:hover {
background: var(--hover-background);
}
.icon-button {
background: transparent;
color: var(--text-secondary);
border: none;
padding: var(--header-icon-padding);
border-radius: 3px;
font-size: var(--header-font-size-small);
font-weight: 500;
display: flex;
align-items: center;
justify-content: center;
transition: all 0.1s ease;
}
.icon-button svg {
width: var(--icon-size);
height: var(--icon-size);
}
.icon-button:hover {
background: var(--hover-background);
color: var(--text-color);
}
:host([isclickthrough]) .button:hover,
:host([isclickthrough]) .icon-button:hover {
background: transparent;
}
.key {
background: var(--key-background);
padding: 2px 6px;
border-radius: 3px;
font-size: 11px;
font-family: 'SF Mono', Monaco, monospace;
}
.click-through-indicator {
font-size: 10px;
color: var(--text-muted);
background: var(--key-background);
padding: 2px 6px;
border-radius: 3px;
font-family: 'SF Mono', Monaco, monospace;
}
.update-button {
background: transparent;
color: #f14c4c;
border: 1px solid #f14c4c;
padding: var(--header-button-padding);
border-radius: 3px;
font-size: var(--header-font-size-small);
font-weight: 500;
display: flex;
align-items: center;
gap: 4px;
transition: all 0.1s ease;
}
.update-button svg {
width: 14px;
height: 14px;
}
.update-button:hover {
background: rgba(241, 76, 76, 0.1);
}
`;
static properties = {
currentView: { type: String },
statusText: { type: String },
startTime: { type: Number },
onCustomizeClick: { type: Function },
onHelpClick: { type: Function },
onHistoryClick: { type: Function },
onCloseClick: { type: Function },
onBackClick: { type: Function },
onHideToggleClick: { type: Function },
isClickThrough: { type: Boolean, reflect: true },
updateAvailable: { type: Boolean },
};
constructor() {
super();
this.currentView = 'main';
this.statusText = '';
this.startTime = null;
this.onCustomizeClick = () => {};
this.onHelpClick = () => {};
this.onHistoryClick = () => {};
this.onCloseClick = () => {};
this.onBackClick = () => {};
this.onHideToggleClick = () => {};
this.isClickThrough = false;
this.updateAvailable = false;
this._timerInterval = null;
}
connectedCallback() {
super.connectedCallback();
this._startTimer();
this._checkForUpdates();
}
async _checkForUpdates() {
try {
const currentVersion = await cheatingDaddy.getVersion();
const response = await fetch('https://raw.githubusercontent.com/sohzm/cheating-daddy/refs/heads/master/package.json');
if (!response.ok) return;
const remotePackage = await response.json();
const remoteVersion = remotePackage.version;
if (this._isNewerVersion(remoteVersion, currentVersion)) {
this.updateAvailable = true;
}
} catch (err) {
console.log('Update check failed:', err.message);
}
}
_isNewerVersion(remote, current) {
const remoteParts = remote.split('.').map(Number);
const currentParts = current.split('.').map(Number);
for (let i = 0; i < Math.max(remoteParts.length, currentParts.length); i++) {
const r = remoteParts[i] || 0;
const c = currentParts[i] || 0;
if (r > c) return true;
if (r < c) return false;
}
return false;
}
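Because the segment-wise comparison above treats missing segments as 0, `'0.5'` and `'0.5.0'` compare as equal. A standalone copy of the same logic (the free-function name here is illustrative, not part of the component):

```javascript
// Illustrative copy of the comparison used by _isNewerVersion():
// split on dots, compare segments numerically, missing segments count as 0.
function isNewerVersion(remote, current) {
    const remoteParts = remote.split('.').map(Number);
    const currentParts = current.split('.').map(Number);
    for (let i = 0; i < Math.max(remoteParts.length, currentParts.length); i++) {
        const r = remoteParts[i] || 0;
        const c = currentParts[i] || 0;
        if (r > c) return true;
        if (r < c) return false;
    }
    return false;
}

console.log(isNewerVersion('0.6.0', '0.5.0')); // true
console.log(isNewerVersion('0.5', '0.5.0')); // false (missing segment counts as 0)
console.log(isNewerVersion('0.5.0', '0.10.0')); // false (numeric, not lexicographic)
```

Note that pre-release suffixes such as `'1.0.0-beta'` produce `NaN` segments, and every `NaN` comparison is false, so such versions compare as equal here; plain numeric versions are assumed.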
async _openUpdatePage() {
const { ipcRenderer } = require('electron');
await ipcRenderer.invoke('open-external', 'https://cheatingdaddy.com');
}
disconnectedCallback() {
super.disconnectedCallback();
this._stopTimer();
}
updated(changedProperties) {
super.updated(changedProperties);
// Start/stop timer based on view change
if (changedProperties.has('currentView')) {
if (this.currentView === 'assistant' && this.startTime) {
this._startTimer();
} else {
this._stopTimer();
}
}
// Start timer when startTime is set
if (changedProperties.has('startTime')) {
if (this.startTime && this.currentView === 'assistant') {
this._startTimer();
} else if (!this.startTime) {
this._stopTimer();
}
}
}
_startTimer() {
// Clear any existing timer
this._stopTimer();
// Only start timer if we're in assistant view and have a start time
if (this.currentView === 'assistant' && this.startTime) {
this._timerInterval = setInterval(() => {
// Trigger a re-render by requesting an update
this.requestUpdate();
}, 1000); // Update every second
}
}
_stopTimer() {
if (this._timerInterval) {
clearInterval(this._timerInterval);
this._timerInterval = null;
}
}
getViewTitle() {
const titles = {
onboarding: 'Welcome to Cheating Daddy',
main: 'Cheating Daddy',
customize: 'Customize',
help: 'Help & Shortcuts',
history: 'Conversation History',
advanced: 'Advanced Tools',
assistant: 'Cheating Daddy',
};
return titles[this.currentView] || 'Cheating Daddy';
}
getElapsedTime() {
if (this.currentView === 'assistant' && this.startTime) {
const elapsed = Math.floor((Date.now() - this.startTime) / 1000);
if (elapsed >= 60) {
const minutes = Math.floor(elapsed / 60);
const seconds = elapsed % 60;
return `${minutes}m ${seconds}s`;
}
return `${elapsed}s`;
}
return '';
}
isNavigationView() {
const navigationViews = ['customize', 'help', 'history', 'advanced'];
return navigationViews.includes(this.currentView);
}
render() {
const elapsedTime = this.getElapsedTime();
return html`
<div class="header">
<div class="header-title">${this.getViewTitle()}</div>
<div class="header-actions">
${this.currentView === 'assistant'
? html`
<span>${elapsedTime}</span>
<span>${this.statusText}</span>
${this.isClickThrough ? html`<span class="click-through-indicator">click-through</span>` : ''}
`
: ''}
${this.currentView === 'main'
? html`
${this.updateAvailable ? html`
<button class="update-button" @click=${this._openUpdatePage}>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" fill="currentColor">
<path fill-rule="evenodd" d="M13.836 2.477a.75.75 0 0 1 .75.75v3.182a.75.75 0 0 1-.75.75h-3.182a.75.75 0 0 1 0-1.5h1.37l-.84-.841a4.5 4.5 0 0 0-7.08.932.75.75 0 0 1-1.3-.75 6 6 0 0 1 9.44-1.242l.842.84V3.227a.75.75 0 0 1 .75-.75Zm-.911 7.5A.75.75 0 0 1 13.199 11a6 6 0 0 1-9.44 1.241l-.84-.84v1.371a.75.75 0 0 1-1.5 0V9.591a.75.75 0 0 1 .75-.75H5.35a.75.75 0 0 1 0 1.5H3.98l.841.841a4.5 4.5 0 0 0 7.08-.932.75.75 0 0 1 1.025-.273Z" clip-rule="evenodd" />
</svg>
Update available
</button>
` : ''}
<button class="icon-button" @click=${this.onHistoryClick}>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path fill-rule="evenodd" d="M10 18a8 8 0 1 0 0-16 8 8 0 0 0 0 16Zm.75-13a.75.75 0 0 0-1.5 0v5c0 .414.336.75.75.75h4a.75.75 0 0 0 0-1.5h-3.25V5Z" clip-rule="evenodd" />
</svg>
</button>
<button class="icon-button" @click=${this.onCustomizeClick}>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path fill-rule="evenodd" d="M7.84 1.804A1 1 0 0 1 8.82 1h2.36a1 1 0 0 1 .98.804l.331 1.652a6.993 6.993 0 0 1 1.929 1.115l1.598-.54a1 1 0 0 1 1.186.447l1.18 2.044a1 1 0 0 1-.205 1.251l-1.267 1.113a7.047 7.047 0 0 1 0 2.228l1.267 1.113a1 1 0 0 1 .206 1.25l-1.18 2.045a1 1 0 0 1-1.187.447l-1.598-.54a6.993 6.993 0 0 1-1.929 1.115l-.33 1.652a1 1 0 0 1-.98.804H8.82a1 1 0 0 1-.98-.804l-.331-1.652a6.993 6.993 0 0 1-1.929-1.115l-1.598.54a1 1 0 0 1-1.186-.447l-1.18-2.044a1 1 0 0 1 .205-1.251l1.267-1.114a7.05 7.05 0 0 1 0-2.227L1.821 7.773a1 1 0 0 1-.206-1.25l1.18-2.045a1 1 0 0 1 1.187-.447l1.598.54A6.992 6.992 0 0 1 7.51 3.456l.33-1.652ZM10 13a3 3 0 1 0 0-6 3 3 0 0 0 0 6Z" clip-rule="evenodd" />
</svg>
</button>
<button class="icon-button" @click=${this.onHelpClick}>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path fill-rule="evenodd" d="M18 10a8 8 0 1 1-16 0 8 8 0 0 1 16 0ZM8.94 6.94a.75.75 0 1 1-1.061-1.061 3 3 0 1 1 2.871 5.026v.345a.75.75 0 0 1-1.5 0v-.5c0-.72.57-1.172 1.081-1.287A1.5 1.5 0 1 0 8.94 6.94ZM10 15a1 1 0 1 0 0-2 1 1 0 0 0 0 2Z" clip-rule="evenodd" />
</svg>
</button>
`
: ''}
${this.currentView === 'assistant'
? html`
<button @click=${this.onHideToggleClick} class="button">
Hide&nbsp;&nbsp;<span class="key" style="pointer-events: none;">${cheatingDaddy.isMacOS ? 'Cmd' : 'Ctrl'}</span
>&nbsp;&nbsp;<span class="key">&bsol;</span>
</button>
<button @click=${this.onCloseClick} class="icon-button window-close">
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path d="M6.28 5.22a.75.75 0 0 0-1.06 1.06L8.94 10l-3.72 3.72a.75.75 0 1 0 1.06 1.06L10 11.06l3.72 3.72a.75.75 0 1 0 1.06-1.06L11.06 10l3.72-3.72a.75.75 0 0 0-1.06-1.06L10 8.94 6.28 5.22Z" />
</svg>
</button>
`
: html`
<button @click=${this.isNavigationView() ? this.onBackClick : this.onCloseClick} class="icon-button window-close">
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path d="M6.28 5.22a.75.75 0 0 0-1.06 1.06L8.94 10l-3.72 3.72a.75.75 0 1 0 1.06 1.06L10 11.06l3.72 3.72a.75.75 0 1 0 1.06-1.06L11.06 10l3.72-3.72a.75.75 0 0 0-1.06-1.06L10 8.94 6.28 5.22Z" />
</svg>
</button>
`}
</div>
</div>
`;
}
}
customElements.define('app-header', AppHeader);


@ -0,0 +1,568 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
import { AppHeader } from './AppHeader.js';
import { MainView } from '../views/MainView.js';
import { CustomizeView } from '../views/CustomizeView.js';
import { HelpView } from '../views/HelpView.js';
import { HistoryView } from '../views/HistoryView.js';
import { AssistantView } from '../views/AssistantView.js';
import { OnboardingView } from '../views/OnboardingView.js';
export class CheatingDaddyApp extends LitElement {
static styles = css`
* {
box-sizing: border-box;
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
margin: 0px;
padding: 0px;
cursor: default;
user-select: none;
}
:host {
display: block;
width: 100%;
height: 100vh;
background-color: var(--background-transparent);
color: var(--text-color);
}
.window-container {
height: 100vh;
overflow: hidden;
background: var(--bg-primary);
}
.container {
display: flex;
flex-direction: column;
height: 100%;
}
.main-content {
flex: 1;
padding: var(--main-content-padding);
overflow-y: auto;
background: var(--main-content-background);
}
.main-content.with-border {
border-top: none;
}
.main-content.assistant-view {
padding: 12px;
}
.main-content.onboarding-view {
padding: 0;
background: transparent;
}
.main-content.settings-view,
.main-content.help-view,
.main-content.history-view {
padding: 0;
}
.view-container {
opacity: 1;
height: 100%;
}
.view-container.entering {
opacity: 0;
}
::-webkit-scrollbar {
width: 8px;
height: 8px;
}
::-webkit-scrollbar-track {
background: transparent;
}
::-webkit-scrollbar-thumb {
background: var(--scrollbar-thumb);
border-radius: 4px;
}
::-webkit-scrollbar-thumb:hover {
background: var(--scrollbar-thumb-hover);
}
`;
static properties = {
currentView: { type: String },
statusText: { type: String },
startTime: { type: Number },
isRecording: { type: Boolean },
sessionActive: { type: Boolean },
selectedProfile: { type: String },
selectedLanguage: { type: String },
responses: { type: Array },
currentResponseIndex: { type: Number },
selectedScreenshotInterval: { type: String },
selectedImageQuality: { type: String },
layoutMode: { type: String },
_viewInstances: { type: Object, state: true },
_isClickThrough: { state: true },
_awaitingNewResponse: { state: true },
shouldAnimateResponse: { type: Boolean },
_storageLoaded: { state: true },
};
constructor() {
super();
// Set defaults - will be overwritten by storage
this.currentView = 'main'; // Will check onboarding after storage loads
this.statusText = '';
this.startTime = null;
this.isRecording = false;
this.sessionActive = false;
this.selectedProfile = 'interview';
this.selectedLanguage = 'en-US';
this.selectedScreenshotInterval = '5';
this.selectedImageQuality = 'medium';
this.layoutMode = 'normal';
this.responses = [];
this.currentResponseIndex = -1;
this._viewInstances = new Map();
this._isClickThrough = false;
this._awaitingNewResponse = false;
this._currentResponseIsComplete = true;
this.shouldAnimateResponse = false;
this._storageLoaded = false;
// Load from storage
this._loadFromStorage();
}
async _loadFromStorage() {
try {
const [config, prefs] = await Promise.all([
cheatingDaddy.storage.getConfig(),
cheatingDaddy.storage.getPreferences()
]);
// Check onboarding status
this.currentView = config.onboarded ? 'main' : 'onboarding';
// Apply background appearance (color + transparency)
this.applyBackgroundAppearance(
prefs.backgroundColor ?? '#1e1e1e',
prefs.backgroundTransparency ?? 0.8
);
// Load preferences
this.selectedProfile = prefs.selectedProfile || 'interview';
this.selectedLanguage = prefs.selectedLanguage || 'en-US';
this.selectedScreenshotInterval = prefs.selectedScreenshotInterval || '5';
this.selectedImageQuality = prefs.selectedImageQuality || 'medium';
this.layoutMode = config.layout || 'normal';
this._storageLoaded = true;
this.updateLayoutMode();
this.requestUpdate();
} catch (error) {
console.error('Error loading from storage:', error);
this._storageLoaded = true;
this.requestUpdate();
}
}
hexToRgb(hex) {
const result = /^#?([a-f\d]{2})([a-f\d]{2})([a-f\d]{2})$/i.exec(hex);
return result ? {
r: parseInt(result[1], 16),
g: parseInt(result[2], 16),
b: parseInt(result[3], 16)
} : { r: 30, g: 30, b: 30 };
}
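// Example (values follow directly from parseInt above): hexToRgb('#ff8000')
// yields { r: 255, g: 128, b: 0 }; any input that fails the six-digit hex
// pattern falls back to the default grey { r: 30, g: 30, b: 30 }.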
lightenColor(rgb, amount) {
return {
r: Math.min(255, rgb.r + amount),
g: Math.min(255, rgb.g + amount),
b: Math.min(255, rgb.b + amount)
};
}
applyBackgroundAppearance(backgroundColor, alpha) {
const root = document.documentElement;
const baseRgb = this.hexToRgb(backgroundColor);
// Generate color variants based on the base color
const secondary = this.lightenColor(baseRgb, 7);
const tertiary = this.lightenColor(baseRgb, 15);
const hover = this.lightenColor(baseRgb, 20);
root.style.setProperty('--header-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--main-content-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--bg-primary', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--bg-secondary', `rgba(${secondary.r}, ${secondary.g}, ${secondary.b}, ${alpha})`);
root.style.setProperty('--bg-tertiary', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--bg-hover', `rgba(${hover.r}, ${hover.g}, ${hover.b}, ${alpha})`);
root.style.setProperty('--input-background', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--input-focus-background', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--hover-background', `rgba(${hover.r}, ${hover.g}, ${hover.b}, ${alpha})`);
root.style.setProperty('--scrollbar-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
}
// Keep old function name for backwards compatibility
applyBackgroundTransparency(alpha) {
this.applyBackgroundAppearance('#1e1e1e', alpha);
}
connectedCallback() {
super.connectedCallback();
// Apply layout mode to document root
this.updateLayoutMode();
// Set up IPC listeners if needed
if (window.require) {
const { ipcRenderer } = window.require('electron');
ipcRenderer.on('new-response', (_, response) => {
this.addNewResponse(response);
});
ipcRenderer.on('update-response', (_, response) => {
this.updateCurrentResponse(response);
});
ipcRenderer.on('update-status', (_, status) => {
this.setStatus(status);
});
ipcRenderer.on('click-through-toggled', (_, isEnabled) => {
this._isClickThrough = isEnabled;
});
ipcRenderer.on('reconnect-failed', (_, data) => {
this.addNewResponse(data.message);
});
}
}
disconnectedCallback() {
super.disconnectedCallback();
if (window.require) {
const { ipcRenderer } = window.require('electron');
ipcRenderer.removeAllListeners('new-response');
ipcRenderer.removeAllListeners('update-response');
ipcRenderer.removeAllListeners('update-status');
ipcRenderer.removeAllListeners('click-through-toggled');
ipcRenderer.removeAllListeners('reconnect-failed');
}
}
setStatus(text) {
this.statusText = text;
// Mark response as complete when we get certain status messages
if (text.includes('Ready') || text.includes('Listening') || text.includes('Error')) {
this._currentResponseIsComplete = true;
console.log('[setStatus] Marked current response as complete');
}
}
addNewResponse(response) {
// Add a new response entry (first word of a new AI response)
this.responses = [...this.responses, response];
this.currentResponseIndex = this.responses.length - 1;
this._awaitingNewResponse = false;
console.log('[addNewResponse] Added:', response);
this.requestUpdate();
}
updateCurrentResponse(response) {
// Update the current response in place (streaming subsequent words)
if (this.responses.length > 0) {
this.responses = [...this.responses.slice(0, -1), response];
console.log('[updateCurrentResponse] Updated to:', response);
} else {
// Fallback: if no responses exist, add as new
this.addNewResponse(response);
}
this.requestUpdate();
}
// Header event handlers
handleCustomizeClick() {
this.currentView = 'customize';
this.requestUpdate();
}
handleHelpClick() {
this.currentView = 'help';
this.requestUpdate();
}
handleHistoryClick() {
this.currentView = 'history';
this.requestUpdate();
}
async handleClose() {
if (this.currentView === 'customize' || this.currentView === 'help' || this.currentView === 'history') {
this.currentView = 'main';
} else if (this.currentView === 'assistant') {
cheatingDaddy.stopCapture();
// Close the session
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('close-session');
}
this.sessionActive = false;
this.currentView = 'main';
console.log('Session closed');
} else {
// Quit the entire application
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('quit-application');
}
}
}
async handleHideToggle() {
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('toggle-window-visibility');
}
}
// Main view event handlers
async handleStart() {
// If the API key is empty, flash the input red and bail out
const apiKey = await cheatingDaddy.storage.getApiKey();
if (!apiKey || apiKey === '') {
// Trigger the red blink animation on the API key input
const mainView = this.shadowRoot.querySelector('main-view');
if (mainView && mainView.triggerApiKeyError) {
mainView.triggerApiKeyError();
}
return;
}
await cheatingDaddy.initializeGemini(this.selectedProfile, this.selectedLanguage);
// Pass the screenshot interval as string (including 'manual' option)
cheatingDaddy.startCapture(this.selectedScreenshotInterval, this.selectedImageQuality);
this.responses = [];
this.currentResponseIndex = -1;
this.startTime = Date.now();
this.currentView = 'assistant';
}
async handleAPIKeyHelp() {
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('open-external', 'https://cheatingdaddy.com/help/api-key');
}
}
// Customize view event handlers
async handleProfileChange(profile) {
this.selectedProfile = profile;
await cheatingDaddy.storage.updatePreference('selectedProfile', profile);
}
async handleLanguageChange(language) {
this.selectedLanguage = language;
await cheatingDaddy.storage.updatePreference('selectedLanguage', language);
}
async handleScreenshotIntervalChange(interval) {
this.selectedScreenshotInterval = interval;
await cheatingDaddy.storage.updatePreference('selectedScreenshotInterval', interval);
}
async handleImageQualityChange(quality) {
this.selectedImageQuality = quality;
await cheatingDaddy.storage.updatePreference('selectedImageQuality', quality);
}
handleBackClick() {
this.currentView = 'main';
this.requestUpdate();
}
// Help view event handlers
async handleExternalLinkClick(url) {
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('open-external', url);
}
}
// Assistant view event handlers
async handleSendText(message) {
const result = await window.cheatingDaddy.sendTextMessage(message);
if (!result.success) {
console.error('Failed to send message:', result.error);
this.setStatus('Error sending message: ' + result.error);
} else {
this.setStatus('Message sent...');
this._awaitingNewResponse = true;
}
}
handleResponseIndexChanged(e) {
this.currentResponseIndex = e.detail.index;
this.shouldAnimateResponse = false;
this.requestUpdate();
}
// Onboarding event handlers
handleOnboardingComplete() {
this.currentView = 'main';
}
updated(changedProperties) {
super.updated(changedProperties);
// Only notify main process of view change if the view actually changed
if (changedProperties.has('currentView') && window.require) {
const { ipcRenderer } = window.require('electron');
ipcRenderer.send('view-changed', this.currentView);
// Re-trigger the enter transition on the next animation frame
const viewContainer = this.shadowRoot?.querySelector('.view-container');
if (viewContainer) {
viewContainer.classList.add('entering');
requestAnimationFrame(() => {
viewContainer.classList.remove('entering');
});
}
}
if (changedProperties.has('layoutMode')) {
this.updateLayoutMode();
}
}
renderCurrentView() {
switch (this.currentView) {
case 'onboarding':
return html`
<onboarding-view .onComplete=${() => this.handleOnboardingComplete()} .onClose=${() => this.handleClose()}></onboarding-view>
`;
case 'main':
return html`
<main-view
.onStart=${() => this.handleStart()}
.onAPIKeyHelp=${() => this.handleAPIKeyHelp()}
.onLayoutModeChange=${layoutMode => this.handleLayoutModeChange(layoutMode)}
></main-view>
`;
case 'customize':
return html`
<customize-view
.selectedProfile=${this.selectedProfile}
.selectedLanguage=${this.selectedLanguage}
.selectedScreenshotInterval=${this.selectedScreenshotInterval}
.selectedImageQuality=${this.selectedImageQuality}
.layoutMode=${this.layoutMode}
.onProfileChange=${profile => this.handleProfileChange(profile)}
.onLanguageChange=${language => this.handleLanguageChange(language)}
.onScreenshotIntervalChange=${interval => this.handleScreenshotIntervalChange(interval)}
.onImageQualityChange=${quality => this.handleImageQualityChange(quality)}
.onLayoutModeChange=${layoutMode => this.handleLayoutModeChange(layoutMode)}
></customize-view>
`;
case 'help':
return html` <help-view .onExternalLinkClick=${url => this.handleExternalLinkClick(url)}></help-view> `;
case 'history':
return html` <history-view></history-view> `;
case 'assistant':
return html`
<assistant-view
.responses=${this.responses}
.currentResponseIndex=${this.currentResponseIndex}
.selectedProfile=${this.selectedProfile}
.onSendText=${message => this.handleSendText(message)}
.shouldAnimateResponse=${this.shouldAnimateResponse}
@response-index-changed=${this.handleResponseIndexChanged}
@response-animation-complete=${() => {
this.shouldAnimateResponse = false;
this._currentResponseIsComplete = true;
console.log('[response-animation-complete] Marked current response as complete');
this.requestUpdate();
}}
></assistant-view>
`;
default:
return html`<div>Unknown view: ${this.currentView}</div>`;
}
}
render() {
const viewClassMap = {
'assistant': 'assistant-view',
'onboarding': 'onboarding-view',
'customize': 'settings-view',
'help': 'help-view',
'history': 'history-view',
};
const mainContentClass = `main-content ${viewClassMap[this.currentView] || 'with-border'}`;
return html`
<div class="window-container">
<div class="container">
<app-header
.currentView=${this.currentView}
.statusText=${this.statusText}
.startTime=${this.startTime}
.onCustomizeClick=${() => this.handleCustomizeClick()}
.onHelpClick=${() => this.handleHelpClick()}
.onHistoryClick=${() => this.handleHistoryClick()}
.onCloseClick=${() => this.handleClose()}
.onBackClick=${() => this.handleBackClick()}
.onHideToggleClick=${() => this.handleHideToggle()}
?isClickThrough=${this._isClickThrough}
></app-header>
<div class="${mainContentClass}">
<div class="view-container">${this.renderCurrentView()}</div>
</div>
</div>
</div>
`;
}
updateLayoutMode() {
// Apply or remove compact layout class to document root
if (this.layoutMode === 'compact') {
document.documentElement.classList.add('compact-layout');
} else {
document.documentElement.classList.remove('compact-layout');
}
}
async handleLayoutModeChange(layoutMode) {
this.layoutMode = layoutMode;
await cheatingDaddy.storage.updateConfig('layout', layoutMode);
this.updateLayoutMode();
// Notify main process about layout change for window resizing
if (window.require) {
try {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('update-sizes');
} catch (error) {
console.error('Failed to update sizes in main process:', error);
}
}
this.requestUpdate();
}
}
customElements.define('cheating-daddy-app', CheatingDaddyApp);

12
src/components/index.js Normal file

@ -0,0 +1,12 @@
// Main app components
export { CheatingDaddyApp } from './app/CheatingDaddyApp.js';
export { AppHeader } from './app/AppHeader.js';
// View components
export { MainView } from './views/MainView.js';
export { CustomizeView } from './views/CustomizeView.js';
export { HelpView } from './views/HelpView.js';
export { HistoryView } from './views/HistoryView.js';
export { AssistantView } from './views/AssistantView.js';
export { OnboardingView } from './views/OnboardingView.js';
export { AdvancedView } from './views/AdvancedView.js';


@ -0,0 +1,636 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
export class AssistantView extends LitElement {
static styles = css`
:host {
height: 100%;
display: flex;
flex-direction: column;
}
* {
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
cursor: default;
}
.response-container {
height: calc(100% - 50px);
overflow-y: auto;
font-size: var(--response-font-size, 16px);
line-height: 1.6;
background: var(--bg-primary);
padding: 12px;
scroll-behavior: smooth;
user-select: text;
cursor: text;
}
.response-container * {
user-select: text;
cursor: text;
}
.response-container a {
cursor: pointer;
}
/* Word display (no animation) */
.response-container [data-word] {
display: inline-block;
}
/* Markdown styling */
.response-container h1,
.response-container h2,
.response-container h3,
.response-container h4,
.response-container h5,
.response-container h6 {
margin: 1em 0 0.5em 0;
color: var(--text-color);
font-weight: 600;
}
.response-container h1 { font-size: 1.6em; }
.response-container h2 { font-size: 1.4em; }
.response-container h3 { font-size: 1.2em; }
.response-container h4 { font-size: 1.1em; }
.response-container h5 { font-size: 1em; }
.response-container h6 { font-size: 0.9em; }
.response-container p {
margin: 0.6em 0;
color: var(--text-color);
}
.response-container ul,
.response-container ol {
margin: 0.6em 0;
padding-left: 1.5em;
color: var(--text-color);
}
.response-container li {
margin: 0.3em 0;
}
.response-container blockquote {
margin: 0.8em 0;
padding: 0.5em 1em;
border-left: 2px solid var(--border-default);
background: var(--bg-secondary);
}
.response-container code {
background: var(--bg-tertiary);
padding: 0.15em 0.4em;
border-radius: 3px;
font-family: 'SF Mono', Monaco, monospace;
font-size: 0.85em;
}
.response-container pre {
background: var(--bg-secondary);
border: 1px solid var(--border-color);
border-radius: 3px;
padding: 12px;
overflow-x: auto;
margin: 0.8em 0;
}
.response-container pre code {
background: none;
padding: 0;
}
.response-container a {
color: var(--text-color);
text-decoration: underline;
text-underline-offset: 2px;
}
.response-container strong,
.response-container b {
font-weight: 600;
}
.response-container hr {
border: none;
border-top: 1px solid var(--border-color);
margin: 1.5em 0;
}
.response-container table {
border-collapse: collapse;
width: 100%;
margin: 0.8em 0;
}
.response-container th,
.response-container td {
border: 1px solid var(--border-color);
padding: 8px;
text-align: left;
}
.response-container th {
background: var(--bg-secondary);
font-weight: 600;
}
.response-container::-webkit-scrollbar {
width: 8px;
}
.response-container::-webkit-scrollbar-track {
background: transparent;
}
.response-container::-webkit-scrollbar-thumb {
background: var(--scrollbar-thumb);
border-radius: 4px;
}
.response-container::-webkit-scrollbar-thumb:hover {
background: var(--scrollbar-thumb-hover);
}
.text-input-container {
display: flex;
gap: 8px;
margin-top: 8px;
align-items: center;
}
.text-input-container input {
flex: 1;
background: transparent;
color: var(--text-color);
border: none;
border-bottom: 1px solid var(--border-color);
padding: 8px 4px;
border-radius: 0;
font-size: 13px;
}
.text-input-container input:focus {
outline: none;
border-bottom-color: var(--text-color);
}
.text-input-container input::placeholder {
color: var(--placeholder-color);
}
.nav-button {
background: transparent;
color: var(--text-secondary);
border: none;
padding: 6px;
border-radius: 3px;
font-size: 12px;
display: flex;
align-items: center;
justify-content: center;
transition: all 0.1s ease;
}
.nav-button:hover {
background: var(--hover-background);
color: var(--text-color);
}
.nav-button:disabled {
opacity: 0.3;
}
.nav-button svg {
width: 18px;
height: 18px;
stroke: currentColor;
}
.response-counter {
font-size: 11px;
color: var(--text-muted);
white-space: nowrap;
min-width: 50px;
text-align: center;
font-family: 'SF Mono', Monaco, monospace;
}
.screen-answer-btn {
display: flex;
align-items: center;
gap: 6px;
background: var(--btn-primary-bg, #ffffff);
color: var(--btn-primary-text, #000000);
border: none;
padding: 6px 12px;
border-radius: 20px;
font-size: 12px;
font-weight: 500;
cursor: pointer;
transition: all 0.15s ease;
white-space: nowrap;
}
.screen-answer-btn:hover {
background: var(--btn-primary-hover, #f0f0f0);
}
.screen-answer-btn svg {
width: 16px;
height: 16px;
flex-shrink: 0;
}
.screen-answer-btn .usage-count {
font-size: 11px;
opacity: 0.7;
font-family: 'SF Mono', Monaco, monospace;
}
.screen-answer-btn-wrapper {
position: relative;
}
.screen-answer-btn-wrapper .tooltip {
position: absolute;
bottom: 100%;
right: 0;
margin-bottom: 8px;
background: var(--tooltip-bg, #1a1a1a);
color: var(--tooltip-text, #ffffff);
padding: 8px 12px;
border-radius: 6px;
font-size: 11px;
white-space: nowrap;
opacity: 0;
visibility: hidden;
transition: opacity 0.15s ease, visibility 0.15s ease;
pointer-events: none;
box-shadow: 0 4px 12px rgba(0,0,0,0.3);
z-index: 100;
}
.screen-answer-btn-wrapper .tooltip::after {
content: '';
position: absolute;
top: 100%;
right: 16px;
border: 6px solid transparent;
border-top-color: var(--tooltip-bg, #1a1a1a);
}
.screen-answer-btn-wrapper:hover .tooltip {
opacity: 1;
visibility: visible;
}
.tooltip-row {
display: flex;
justify-content: space-between;
gap: 16px;
margin-bottom: 4px;
}
.tooltip-row:last-child {
margin-bottom: 0;
}
.tooltip-label {
opacity: 0.7;
}
.tooltip-value {
font-family: 'SF Mono', Monaco, monospace;
}
.tooltip-note {
margin-top: 6px;
padding-top: 6px;
border-top: 1px solid rgba(255,255,255,0.1);
opacity: 0.5;
font-size: 10px;
}
`;
static properties = {
responses: { type: Array },
currentResponseIndex: { type: Number },
selectedProfile: { type: String },
onSendText: { type: Function },
shouldAnimateResponse: { type: Boolean },
flashCount: { type: Number },
flashLiteCount: { type: Number },
};
constructor() {
super();
this.responses = [];
this.currentResponseIndex = -1;
this.selectedProfile = 'interview';
this.onSendText = () => {};
this.flashCount = 0;
this.flashLiteCount = 0;
}
getProfileNames() {
return {
interview: 'Job Interview',
sales: 'Sales Call',
meeting: 'Business Meeting',
presentation: 'Presentation',
negotiation: 'Negotiation',
exam: 'Exam Assistant',
};
}
getCurrentResponse() {
const profileNames = this.getProfileNames();
return this.responses.length > 0 && this.currentResponseIndex >= 0
? this.responses[this.currentResponseIndex]
: `Hey, I'm listening to your ${profileNames[this.selectedProfile] || 'session'}?`;
}
renderMarkdown(content) {
// Check if marked is available
if (typeof window !== 'undefined' && window.marked) {
try {
// Configure marked for better security and formatting
window.marked.setOptions({
breaks: true,
gfm: true,
sanitize: false, // We trust the AI responses
});
let rendered = window.marked.parse(content);
rendered = this.wrapWordsInSpans(rendered);
return rendered;
} catch (error) {
console.warn('Error parsing markdown:', error);
return content; // Fallback to plain text
}
}
console.log('Marked not available, using plain text');
return content; // Fallback if marked is not available
}
wrapWordsInSpans(html) {
const parser = new DOMParser();
const doc = parser.parseFromString(html, 'text/html');
const tagsToSkip = ['PRE'];
function wrap(node) {
if (node.nodeType === Node.TEXT_NODE && node.textContent.trim() && !tagsToSkip.includes(node.parentNode.tagName)) {
const words = node.textContent.split(/(\s+)/);
const frag = document.createDocumentFragment();
words.forEach(word => {
if (word.trim()) {
const span = document.createElement('span');
span.setAttribute('data-word', '');
span.textContent = word;
frag.appendChild(span);
} else {
frag.appendChild(document.createTextNode(word));
}
});
node.parentNode.replaceChild(frag, node);
} else if (node.nodeType === Node.ELEMENT_NODE && !tagsToSkip.includes(node.tagName)) {
Array.from(node.childNodes).forEach(wrap);
}
}
Array.from(doc.body.childNodes).forEach(wrap);
return doc.body.innerHTML;
}
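// A hedged illustration of the transform above:
//   wrapWordsInSpans('<p>hi there</p>')
//   → '<p><span data-word="">hi</span> <span data-word="">there</span></p>'
// Text nodes inside PRE elements are skipped so code blocks keep their spacing.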
getResponseCounter() {
return this.responses.length > 0 ? `${this.currentResponseIndex + 1}/${this.responses.length}` : '';
}
navigateToPreviousResponse() {
if (this.currentResponseIndex > 0) {
this.currentResponseIndex--;
this.dispatchEvent(
new CustomEvent('response-index-changed', {
detail: { index: this.currentResponseIndex },
})
);
this.requestUpdate();
}
}
navigateToNextResponse() {
if (this.currentResponseIndex < this.responses.length - 1) {
this.currentResponseIndex++;
this.dispatchEvent(
new CustomEvent('response-index-changed', {
detail: { index: this.currentResponseIndex },
})
);
this.requestUpdate();
}
}
scrollResponseUp() {
const container = this.shadowRoot.querySelector('.response-container');
if (container) {
const scrollAmount = container.clientHeight * 0.3; // Scroll 30% of container height
container.scrollTop = Math.max(0, container.scrollTop - scrollAmount);
}
}
scrollResponseDown() {
const container = this.shadowRoot.querySelector('.response-container');
if (container) {
const scrollAmount = container.clientHeight * 0.3; // Scroll 30% of container height
container.scrollTop = Math.min(container.scrollHeight - container.clientHeight, container.scrollTop + scrollAmount);
}
}
connectedCallback() {
super.connectedCallback();
// Load limits on mount
this.loadLimits();
// Set up IPC listeners for keyboard shortcuts
if (window.require) {
const { ipcRenderer } = window.require('electron');
this.handlePreviousResponse = () => {
console.log('Received navigate-previous-response message');
this.navigateToPreviousResponse();
};
this.handleNextResponse = () => {
console.log('Received navigate-next-response message');
this.navigateToNextResponse();
};
this.handleScrollUp = () => {
console.log('Received scroll-response-up message');
this.scrollResponseUp();
};
this.handleScrollDown = () => {
console.log('Received scroll-response-down message');
this.scrollResponseDown();
};
ipcRenderer.on('navigate-previous-response', this.handlePreviousResponse);
ipcRenderer.on('navigate-next-response', this.handleNextResponse);
ipcRenderer.on('scroll-response-up', this.handleScrollUp);
ipcRenderer.on('scroll-response-down', this.handleScrollDown);
}
}
disconnectedCallback() {
super.disconnectedCallback();
// Clean up IPC listeners
if (window.require) {
const { ipcRenderer } = window.require('electron');
if (this.handlePreviousResponse) {
ipcRenderer.removeListener('navigate-previous-response', this.handlePreviousResponse);
}
if (this.handleNextResponse) {
ipcRenderer.removeListener('navigate-next-response', this.handleNextResponse);
}
if (this.handleScrollUp) {
ipcRenderer.removeListener('scroll-response-up', this.handleScrollUp);
}
if (this.handleScrollDown) {
ipcRenderer.removeListener('scroll-response-down', this.handleScrollDown);
}
}
}
async handleSendText() {
const textInput = this.shadowRoot.querySelector('#textInput');
if (textInput && textInput.value.trim()) {
const message = textInput.value.trim();
textInput.value = ''; // Clear input
await this.onSendText(message);
}
}
handleTextKeydown(e) {
if (e.key === 'Enter' && !e.shiftKey) {
e.preventDefault();
this.handleSendText();
}
}
async loadLimits() {
if (window.cheatingDaddy?.storage?.getTodayLimits) {
const limits = await window.cheatingDaddy.storage.getTodayLimits();
this.flashCount = limits.flash?.count || 0;
this.flashLiteCount = limits.flashLite?.count || 0;
}
}
getTotalUsed() {
return this.flashCount + this.flashLiteCount;
}
getTotalAvailable() {
return 40; // 20 flash + 20 flash-lite
}
async handleScreenAnswer() {
if (window.captureManualScreenshot) {
window.captureManualScreenshot();
// Reload limits after a short delay to catch the update
setTimeout(() => this.loadLimits(), 1000);
}
}
scrollToBottom() {
setTimeout(() => {
const container = this.shadowRoot.querySelector('.response-container');
if (container) {
container.scrollTop = container.scrollHeight;
}
}, 0);
}
firstUpdated() {
super.firstUpdated();
this.updateResponseContent();
}
updated(changedProperties) {
super.updated(changedProperties);
if (changedProperties.has('responses') || changedProperties.has('currentResponseIndex')) {
this.updateResponseContent();
}
}
updateResponseContent() {
console.log('updateResponseContent called');
const container = this.shadowRoot.querySelector('#responseContainer');
if (container) {
const currentResponse = this.getCurrentResponse();
console.log('Current response:', currentResponse);
const renderedResponse = this.renderMarkdown(currentResponse);
console.log('Rendered response:', renderedResponse);
container.innerHTML = renderedResponse;
// No word-by-word animation: content is shown at once, so signal completion immediately
if (this.shouldAnimateResponse) {
this.dispatchEvent(new CustomEvent('response-animation-complete', { bubbles: true, composed: true }));
}
} else {
console.log('Response container not found');
}
}
render() {
const responseCounter = this.getResponseCounter();
return html`
<div class="response-container" id="responseContainer"></div>
<div class="text-input-container">
<button class="nav-button" @click=${this.navigateToPreviousResponse} ?disabled=${this.currentResponseIndex <= 0}>
<svg width="24px" height="24px" stroke-width="1.7" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M15 6L9 12L15 18" stroke="currentColor" stroke-width="1.7" stroke-linecap="round" stroke-linejoin="round"></path>
</svg>
</button>
${this.responses.length > 0 ? html`<span class="response-counter">${responseCounter}</span>` : ''}
<button class="nav-button" @click=${this.navigateToNextResponse} ?disabled=${this.currentResponseIndex >= this.responses.length - 1}>
<svg width="24px" height="24px" stroke-width="1.7" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M9 6L15 12L9 18" stroke="currentColor" stroke-width="1.7" stroke-linecap="round" stroke-linejoin="round"></path>
</svg>
</button>
<input type="text" id="textInput" placeholder="Type a message to the AI..." @keydown=${this.handleTextKeydown} />
<div class="screen-answer-btn-wrapper">
<div class="tooltip">
<div class="tooltip-row">
<span class="tooltip-label">Flash</span>
<span class="tooltip-value">${this.flashCount}/20</span>
</div>
<div class="tooltip-row">
<span class="tooltip-label">Flash Lite</span>
<span class="tooltip-value">${this.flashLiteCount}/20</span>
</div>
<div class="tooltip-note">Resets every 24 hours</div>
</div>
<button class="screen-answer-btn" @click=${this.handleScreenAnswer}>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path d="M15.98 1.804a1 1 0 0 0-1.96 0l-.24 1.192a1 1 0 0 1-.784.785l-1.192.238a1 1 0 0 0 0 1.962l1.192.238a1 1 0 0 1 .785.785l.238 1.192a1 1 0 0 0 1.962 0l.238-1.192a1 1 0 0 1 .785-.785l1.192-.238a1 1 0 0 0 0-1.962l-1.192-.238a1 1 0 0 1-.785-.785l-.238-1.192ZM6.949 5.684a1 1 0 0 0-1.898 0l-.683 2.051a1 1 0 0 1-.633.633l-2.051.683a1 1 0 0 0 0 1.898l2.051.684a1 1 0 0 1 .633.632l.683 2.051a1 1 0 0 0 1.898 0l.683-2.051a1 1 0 0 1 .633-.633l2.051-.683a1 1 0 0 0 0-1.898l-2.051-.683a1 1 0 0 1-.633-.633L6.95 5.684ZM13.949 13.684a1 1 0 0 0-1.898 0l-.184.551a1 1 0 0 1-.632.633l-.551.183a1 1 0 0 0 0 1.898l.551.183a1 1 0 0 1 .633.633l.183.551a1 1 0 0 0 1.898 0l.184-.551a1 1 0 0 1 .632-.633l.551-.183a1 1 0 0 0 0-1.898l-.551-.184a1 1 0 0 1-.633-.632l-.183-.551Z" />
</svg>
<span>Analyze screen</span>
<span class="usage-count">(${this.getTotalUsed()}/${this.getTotalAvailable()})</span>
</button>
</div>
</div>
`;
}
}
customElements.define('assistant-view', AssistantView);

File diff suppressed because it is too large

@@ -0,0 +1,450 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
import { resizeLayout } from '../../utils/windowResize.js';
export class HelpView extends LitElement {
static styles = css`
* {
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
cursor: default;
user-select: none;
}
:host {
display: block;
padding: 0;
}
.help-container {
display: flex;
flex-direction: column;
}
.option-group {
padding: 16px 12px;
border-bottom: 1px solid var(--border-color);
}
.option-group:last-child {
border-bottom: none;
}
.option-label {
font-size: 11px;
font-weight: 600;
color: var(--text-muted);
text-transform: uppercase;
letter-spacing: 0.5px;
margin-bottom: 12px;
}
.description {
color: var(--text-secondary);
font-size: 12px;
line-height: 1.4;
user-select: text;
cursor: text;
}
.description strong {
color: var(--text-color);
font-weight: 500;
}
.link {
color: var(--text-color);
text-decoration: underline;
text-underline-offset: 2px;
cursor: pointer;
}
.key {
background: var(--bg-tertiary);
color: var(--text-color);
border: 1px solid var(--border-color);
padding: 2px 6px;
border-radius: 3px;
font-size: 10px;
font-family: 'SF Mono', Monaco, monospace;
font-weight: 500;
margin: 0 1px;
white-space: nowrap;
}
.keyboard-section {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: 12px;
margin-top: 8px;
}
.keyboard-group {
padding: 10px 0;
border-bottom: 1px solid var(--border-color);
}
.keyboard-group:last-child {
border-bottom: none;
}
.keyboard-group-title {
font-weight: 600;
font-size: 12px;
color: var(--text-color);
margin-bottom: 8px;
}
.shortcut-item {
display: flex;
justify-content: space-between;
align-items: center;
padding: 4px 0;
font-size: 11px;
}
.shortcut-description {
color: var(--text-secondary);
}
.shortcut-keys {
display: flex;
gap: 2px;
}
.profiles-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(180px, 1fr));
gap: 8px;
margin-top: 8px;
}
.profile-item {
padding: 8px 0;
border-bottom: 1px solid var(--border-color);
}
.profile-item:last-child {
border-bottom: none;
}
.profile-name {
font-weight: 500;
font-size: 12px;
color: var(--text-color);
margin-bottom: 2px;
}
.profile-description {
font-size: 11px;
color: var(--text-muted);
line-height: 1.3;
}
.community-links {
display: flex;
gap: 8px;
flex-wrap: wrap;
}
.community-link {
display: flex;
align-items: center;
gap: 6px;
padding: 6px 10px;
background: transparent;
border: 1px solid var(--border-color);
border-radius: 3px;
color: var(--text-color);
font-size: 11px;
font-weight: 500;
transition: background 0.1s ease;
cursor: pointer;
}
.community-link:hover {
background: var(--hover-background);
}
.community-link svg {
width: 14px;
height: 14px;
flex-shrink: 0;
}
.usage-steps {
counter-reset: step-counter;
}
.usage-step {
counter-increment: step-counter;
position: relative;
padding-left: 24px;
margin-bottom: 8px;
font-size: 11px;
line-height: 1.4;
color: var(--text-secondary);
}
.usage-step::before {
content: counter(step-counter);
position: absolute;
left: 0;
top: 0;
width: 16px;
height: 16px;
background: var(--bg-tertiary);
color: var(--text-color);
border-radius: 3px;
display: flex;
align-items: center;
justify-content: center;
font-size: 10px;
font-weight: 600;
}
.usage-step strong {
color: var(--text-color);
}
`;
static properties = {
onExternalLinkClick: { type: Function },
keybinds: { type: Object },
};
constructor() {
super();
this.onExternalLinkClick = () => {};
this.keybinds = this.getDefaultKeybinds();
this._loadKeybinds();
}
async _loadKeybinds() {
try {
const keybinds = await cheatingDaddy.storage.getKeybinds();
if (keybinds) {
this.keybinds = { ...this.getDefaultKeybinds(), ...keybinds };
this.requestUpdate();
}
} catch (error) {
console.error('Error loading keybinds:', error);
}
}
connectedCallback() {
super.connectedCallback();
// Resize window for this view
resizeLayout();
}
getDefaultKeybinds() {
const isMac = cheatingDaddy.isMacOS || navigator.platform.includes('Mac');
return {
moveUp: isMac ? 'Alt+Up' : 'Ctrl+Up',
moveDown: isMac ? 'Alt+Down' : 'Ctrl+Down',
moveLeft: isMac ? 'Alt+Left' : 'Ctrl+Left',
moveRight: isMac ? 'Alt+Right' : 'Ctrl+Right',
toggleVisibility: isMac ? 'Cmd+\\' : 'Ctrl+\\',
toggleClickThrough: isMac ? 'Cmd+M' : 'Ctrl+M',
nextStep: isMac ? 'Cmd+Enter' : 'Ctrl+Enter',
previousResponse: isMac ? 'Cmd+[' : 'Ctrl+[',
nextResponse: isMac ? 'Cmd+]' : 'Ctrl+]',
scrollUp: isMac ? 'Cmd+Shift+Up' : 'Ctrl+Shift+Up',
scrollDown: isMac ? 'Cmd+Shift+Down' : 'Ctrl+Shift+Down',
};
}
formatKeybind(keybind) {
return keybind.split('+').map(key => html`<span class="key">${key}</span>`);
}
handleExternalLinkClick(url) {
this.onExternalLinkClick(url);
}
render() {
const isMacOS = cheatingDaddy.isMacOS || false;
const isLinux = cheatingDaddy.isLinux || false;
return html`
<div class="help-container">
<div class="option-group">
<div class="option-label">
<span>Community & Support</span>
</div>
<div class="community-links">
<div class="community-link" @click=${() => this.handleExternalLinkClick('https://cheatingdaddy.com')}>
<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<path d="M14 11.9976C14 9.5059 11.683 7 8.85714 7C8.52241 7 7.41904 7.00001 7.14286 7.00001C4.30254 7.00001 2 9.23752 2 11.9976C2 14.376 3.70973 16.3664 6 16.8714C6.36756 16.9525 6.75006 16.9952 7.14286 16.9952"></path>
<path d="M10 11.9976C10 14.4893 12.317 16.9952 15.1429 16.9952C15.4776 16.9952 16.581 16.9952 16.8571 16.9952C19.6975 16.9952 22 14.7577 22 11.9976C22 9.6192 20.2903 7.62884 18 7.12383C17.6324 7.04278 17.2499 6.99999 16.8571 6.99999"></path>
</svg>
Website
</div>
<div class="community-link" @click=${() => this.handleExternalLinkClick('https://github.com/sohzm/cheating-daddy')}>
<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<path d="M16 22.0268V19.1568C16.0375 18.68 15.9731 18.2006 15.811 17.7506C15.6489 17.3006 15.3929 16.8902 15.06 16.5468C18.2 16.1968 21.5 15.0068 21.5 9.54679C21.4997 8.15062 20.9627 6.80799 20 5.79679C20.4558 4.5753 20.4236 3.22514 19.91 2.02679C19.91 2.02679 18.73 1.67679 16 3.50679C13.708 2.88561 11.292 2.88561 8.99999 3.50679C6.26999 1.67679 5.08999 2.02679 5.08999 2.02679C4.57636 3.22514 4.54413 4.5753 4.99999 5.79679C4.03011 6.81549 3.49251 8.17026 3.49999 9.57679C3.49999 14.9968 6.79998 16.1868 9.93998 16.5768C9.61098 16.9168 9.35725 17.3222 9.19529 17.7667C9.03334 18.2112 8.96679 18.6849 8.99999 19.1568V22.0268"></path>
<path d="M9 20.0267C6 20.9999 3.5 20.0267 2 17.0267"></path>
</svg>
GitHub
</div>
<div class="community-link" @click=${() => this.handleExternalLinkClick('https://discord.gg/GCBdubnXfJ')}>
<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<path d="M5.5 16C10.5 18.5 13.5 18.5 18.5 16"></path>
<path d="M15.5 17.5L16.5 19.5C16.5 19.5 20.6713 18.1717 22 16C22 15 22.5301 7.85339 19 5.5C17.5 4.5 15 4 15 4L14 6H12"></path>
<path d="M8.52832 17.5L7.52832 19.5C7.52832 19.5 3.35699 18.1717 2.02832 16C2.02832 15 1.49823 7.85339 5.02832 5.5C6.52832 4.5 9.02832 4 9.02832 4L10.0283 6H12.0283"></path>
<path d="M8.5 14C7.67157 14 7 13.1046 7 12C7 10.8954 7.67157 10 8.5 10C9.32843 10 10 10.8954 10 12C10 13.1046 9.32843 14 8.5 14Z"></path>
<path d="M15.5 14C14.6716 14 14 13.1046 14 12C14 10.8954 14.6716 10 15.5 10C16.3284 10 17 10.8954 17 12C17 13.1046 16.3284 14 15.5 14Z"></path>
</svg>
Discord
</div>
</div>
</div>
<div class="option-group">
<div class="option-label">
<span>Keyboard Shortcuts</span>
</div>
<div class="keyboard-section">
<div class="keyboard-group">
<div class="keyboard-group-title">Window Movement</div>
<div class="shortcut-item">
<span class="shortcut-description">Move window up</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.moveUp)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Move window down</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.moveDown)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Move window left</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.moveLeft)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Move window right</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.moveRight)}</div>
</div>
</div>
<div class="keyboard-group">
<div class="keyboard-group-title">Window Control</div>
<div class="shortcut-item">
<span class="shortcut-description">Toggle click-through mode</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.toggleClickThrough)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Toggle window visibility</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.toggleVisibility)}</div>
</div>
</div>
<div class="keyboard-group">
<div class="keyboard-group-title">AI Actions</div>
<div class="shortcut-item">
<span class="shortcut-description">Take screenshot and ask for next step</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.nextStep)}</div>
</div>
</div>
<div class="keyboard-group">
<div class="keyboard-group-title">Response Navigation</div>
<div class="shortcut-item">
<span class="shortcut-description">Previous response</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.previousResponse)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Next response</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.nextResponse)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Scroll response up</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.scrollUp)}</div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">Scroll response down</span>
<div class="shortcut-keys">${this.formatKeybind(this.keybinds.scrollDown)}</div>
</div>
</div>
<div class="keyboard-group">
<div class="keyboard-group-title">Text Input</div>
<div class="shortcut-item">
<span class="shortcut-description">Send message to AI</span>
<div class="shortcut-keys"><span class="key">Enter</span></div>
</div>
<div class="shortcut-item">
<span class="shortcut-description">New line in text input</span>
<div class="shortcut-keys"><span class="key">Shift</span><span class="key">Enter</span></div>
</div>
</div>
</div>
<div class="description" style="margin-top: 12px; text-align: center;">
You can customize these shortcuts in Settings.
</div>
</div>
<div class="option-group">
<div class="option-label">
<span>How to Use</span>
</div>
<div class="usage-steps">
<div class="usage-step"><strong>Start a Session:</strong> Enter your Gemini API key and click "Start Session"</div>
<div class="usage-step"><strong>Customize:</strong> Choose your profile and language in the settings</div>
<div class="usage-step">
<strong>Position Window:</strong> Use keyboard shortcuts to move the window to your desired location
</div>
<div class="usage-step">
<strong>Click-through Mode:</strong> Use ${this.formatKeybind(this.keybinds.toggleClickThrough)} to make the window
click-through
</div>
<div class="usage-step"><strong>Get AI Help:</strong> The AI will analyze your screen and audio to provide assistance</div>
<div class="usage-step"><strong>Text Messages:</strong> Type questions or requests to the AI using the text input</div>
<div class="usage-step">
<strong>Navigate Responses:</strong> Use ${this.formatKeybind(this.keybinds.previousResponse)} and
${this.formatKeybind(this.keybinds.nextResponse)} to browse through AI responses
</div>
</div>
</div>
<div class="option-group">
<div class="option-label">
<span>Supported Profiles</span>
</div>
<div class="profiles-grid">
<div class="profile-item">
<div class="profile-name">Job Interview</div>
<div class="profile-description">Get help with interview questions and responses</div>
</div>
<div class="profile-item">
<div class="profile-name">Sales Call</div>
<div class="profile-description">Assistance with sales conversations and objection handling</div>
</div>
<div class="profile-item">
<div class="profile-name">Business Meeting</div>
<div class="profile-description">Support for professional meetings and discussions</div>
</div>
<div class="profile-item">
<div class="profile-name">Presentation</div>
<div class="profile-description">Help with presentations and public speaking</div>
</div>
<div class="profile-item">
<div class="profile-name">Negotiation</div>
<div class="profile-description">Guidance for business negotiations and deals</div>
</div>
<div class="profile-item">
<div class="profile-name">Exam Assistant</div>
<div class="profile-description">Academic assistance for test-taking and exam questions</div>
</div>
</div>
</div>
<div class="option-group">
<div class="option-label">
<span>Audio Input</span>
</div>
<div class="description">The AI listens to conversations and provides contextual assistance based on what it hears.</div>
</div>
</div>
`;
}
}
customElements.define('help-view', HelpView);
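The keybind strings rendered by `formatKeybind` above follow a simple convention: keys joined by `'+'` (e.g. `'Cmd+Shift+Up'`), split into individual tokens for display. A sketch of that splitting rule as a plain function (`keybindTokens` is an illustrative name, not part of the component):

```javascript
// Split a stored keybind string into its key tokens, mirroring the
// keybind.split('+') call inside HelpView.formatKeybind.
function keybindTokens(keybind) {
    return keybind.split('+');
}

keybindTokens('Ctrl+Shift+Down'); // ['Ctrl', 'Shift', 'Down']
```

Note this convention means a literal `+` key cannot itself appear in a keybind string without an escaping scheme, which the defaults above avoid.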

@@ -0,0 +1,649 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
import { resizeLayout } from '../../utils/windowResize.js';
export class HistoryView extends LitElement {
static styles = css`
* {
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
cursor: default;
user-select: none;
}
:host {
height: 100%;
display: flex;
flex-direction: column;
width: 100%;
}
.history-container {
height: 100%;
display: flex;
flex-direction: column;
}
.sessions-list {
flex: 1;
overflow-y: auto;
}
.session-item {
padding: 12px;
border-bottom: 1px solid var(--border-color);
cursor: pointer;
transition: background 0.1s ease;
}
.session-item:hover {
background: var(--hover-background);
}
.session-item.selected {
background: var(--bg-secondary);
}
.session-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 4px;
}
.session-date {
font-size: 12px;
font-weight: 500;
color: var(--text-color);
}
.session-time {
font-size: 11px;
color: var(--text-muted);
font-family: 'SF Mono', Monaco, monospace;
}
.session-preview {
font-size: 11px;
color: var(--text-muted);
line-height: 1.3;
}
.conversation-view {
flex: 1;
overflow-y: auto;
background: var(--bg-primary);
padding: 12px 0;
user-select: text;
cursor: text;
}
.message {
margin-bottom: 8px;
padding: 8px 12px;
border-left: 2px solid transparent;
font-size: 12px;
line-height: 1.4;
background: var(--bg-secondary);
user-select: text;
cursor: text;
white-space: pre-wrap;
word-wrap: break-word;
}
.message.user {
border-left-color: #3b82f6;
}
.message.ai {
border-left-color: #ef4444;
}
.back-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 12px;
padding: 12px 12px 12px 12px;
border-bottom: 1px solid var(--border-color);
}
.back-button {
background: transparent;
color: var(--text-color);
border: 1px solid var(--border-color);
padding: 6px 12px;
border-radius: 3px;
font-size: 11px;
font-weight: 500;
cursor: pointer;
display: flex;
align-items: center;
gap: 6px;
transition: background 0.1s ease;
}
.back-button:hover {
background: var(--hover-background);
}
.legend {
display: flex;
gap: 12px;
align-items: center;
}
.legend-item {
display: flex;
align-items: center;
gap: 4px;
font-size: 10px;
color: var(--text-muted);
}
.legend-dot {
width: 8px;
height: 2px;
}
.legend-dot.user {
background-color: #3b82f6;
}
.legend-dot.ai {
background-color: #ef4444;
}
.legend-dot.screen {
background-color: #22c55e;
}
.session-context {
padding: 8px 12px;
margin-bottom: 8px;
background: var(--bg-tertiary);
border-radius: 4px;
font-size: 11px;
}
.session-context-row {
display: flex;
gap: 8px;
margin-bottom: 4px;
}
.session-context-row:last-child {
margin-bottom: 0;
}
.context-label {
color: var(--text-muted);
min-width: 80px;
}
.context-value {
color: var(--text-color);
font-weight: 500;
}
.custom-prompt-value {
color: var(--text-secondary);
font-style: italic;
word-break: break-word;
white-space: pre-wrap;
}
.view-tabs {
display: flex;
gap: 0;
border-bottom: 1px solid var(--border-color);
margin-bottom: 8px;
}
.view-tab {
background: transparent;
color: var(--text-muted);
border: none;
padding: 8px 16px;
font-size: 11px;
font-weight: 500;
cursor: pointer;
border-bottom: 2px solid transparent;
margin-bottom: -1px;
transition: color 0.1s ease;
}
.view-tab:hover {
color: var(--text-color);
}
.view-tab.active {
color: var(--text-color);
border-bottom-color: var(--text-color);
}
.message.screen {
border-left-color: #22c55e;
}
.analysis-meta {
font-size: 10px;
color: var(--text-muted);
margin-bottom: 4px;
font-family: 'SF Mono', Monaco, monospace;
}
.empty-state {
text-align: center;
color: var(--text-muted);
font-size: 12px;
margin-top: 32px;
}
.empty-state-title {
font-size: 14px;
font-weight: 500;
margin-bottom: 6px;
color: var(--text-secondary);
}
.loading {
text-align: center;
color: var(--text-muted);
font-size: 12px;
margin-top: 32px;
}
.sessions-list::-webkit-scrollbar,
.conversation-view::-webkit-scrollbar {
width: 8px;
}
.sessions-list::-webkit-scrollbar-track,
.conversation-view::-webkit-scrollbar-track {
background: transparent;
}
.sessions-list::-webkit-scrollbar-thumb,
.conversation-view::-webkit-scrollbar-thumb {
background: var(--scrollbar-thumb);
border-radius: 4px;
}
.sessions-list::-webkit-scrollbar-thumb:hover,
.conversation-view::-webkit-scrollbar-thumb:hover {
background: var(--scrollbar-thumb-hover);
}
.tabs-container {
display: flex;
gap: 0;
margin-bottom: 16px;
border-bottom: 1px solid var(--border-color);
}
.tab {
background: transparent;
color: var(--text-muted);
border: none;
padding: 8px 16px;
font-size: 12px;
font-weight: 500;
cursor: pointer;
transition: color 0.1s ease;
border-bottom: 2px solid transparent;
margin-bottom: -1px;
}
.tab:hover {
color: var(--text-color);
}
.tab.active {
color: var(--text-color);
border-bottom-color: var(--text-color);
}
.saved-response-item {
padding: 12px 0;
border-bottom: 1px solid var(--border-color);
}
.saved-response-header {
display: flex;
justify-content: space-between;
align-items: flex-start;
margin-bottom: 6px;
}
.saved-response-profile {
font-size: 11px;
font-weight: 500;
color: var(--text-secondary);
text-transform: capitalize;
}
.saved-response-date {
font-size: 10px;
color: var(--text-muted);
font-family: 'SF Mono', Monaco, monospace;
}
.saved-response-content {
font-size: 12px;
color: var(--text-color);
line-height: 1.4;
user-select: text;
cursor: text;
}
.delete-button {
background: transparent;
color: var(--text-muted);
border: none;
padding: 4px;
border-radius: 3px;
cursor: pointer;
transition: all 0.1s ease;
}
.delete-button:hover {
background: rgba(241, 76, 76, 0.1);
color: var(--error-color);
}
`;
static properties = {
sessions: { type: Array },
selectedSession: { type: Object },
loading: { type: Boolean },
activeTab: { type: String },
};
constructor() {
super();
this.sessions = [];
this.selectedSession = null;
this.loading = true;
this.activeTab = 'conversation'; // 'conversation', 'screen', or 'context'
this.loadSessions();
}
connectedCallback() {
super.connectedCallback();
// Resize window for this view
resizeLayout();
}
async loadSessions() {
try {
this.loading = true;
this.sessions = await cheatingDaddy.storage.getAllSessions();
} catch (error) {
console.error('Error loading conversation sessions:', error);
this.sessions = [];
} finally {
this.loading = false;
this.requestUpdate();
}
}
async loadSelectedSession(sessionId) {
try {
const session = await cheatingDaddy.storage.getSession(sessionId);
if (session) {
this.selectedSession = session;
this.requestUpdate();
}
} catch (error) {
console.error('Error loading session:', error);
}
}
formatDate(timestamp) {
const date = new Date(timestamp);
return date.toLocaleDateString('en-US', {
month: 'short',
day: 'numeric',
year: 'numeric',
});
}
formatTime(timestamp) {
const date = new Date(timestamp);
return date.toLocaleTimeString('en-US', {
hour: '2-digit',
minute: '2-digit',
});
}
formatTimestamp(timestamp) {
const date = new Date(timestamp);
return date.toLocaleString('en-US', {
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit',
});
}
getSessionPreview(session) {
const parts = [];
if (session.messageCount > 0) {
parts.push(`${session.messageCount} ${session.messageCount === 1 ? 'message' : 'messages'}`);
}
if (session.screenAnalysisCount > 0) {
parts.push(`${session.screenAnalysisCount} screen ${session.screenAnalysisCount === 1 ? 'analysis' : 'analyses'}`);
}
if (session.profile) {
const profileNames = this.getProfileNames();
parts.push(profileNames[session.profile] || session.profile);
}
return parts.length > 0 ? parts.join(' • ') : 'Empty session';
}
handleSessionClick(session) {
this.loadSelectedSession(session.sessionId);
}
handleBackClick() {
this.selectedSession = null;
this.activeTab = 'conversation';
}
handleTabClick(tab) {
this.activeTab = tab;
}
getProfileNames() {
return {
interview: 'Job Interview',
sales: 'Sales Call',
meeting: 'Business Meeting',
presentation: 'Presentation',
negotiation: 'Negotiation',
exam: 'Exam Assistant',
};
}
renderSessionsList() {
if (this.loading) {
return html`<div class="loading">Loading conversation history...</div>`;
}
if (this.sessions.length === 0) {
return html`
<div class="empty-state">
<div class="empty-state-title">No conversations yet</div>
<div>Start a session to see your conversation history here</div>
</div>
`;
}
return html`
<div class="sessions-list">
${this.sessions.map(
session => html`
<div class="session-item" @click=${() => this.handleSessionClick(session)}>
<div class="session-header">
<div class="session-date">${this.formatDate(session.createdAt)}</div>
<div class="session-time">${this.formatTime(session.createdAt)}</div>
</div>
<div class="session-preview">${this.getSessionPreview(session)}</div>
</div>
`
)}
</div>
`;
}
renderContextContent() {
const { profile, customPrompt } = this.selectedSession;
const profileNames = this.getProfileNames();
if (!profile && !customPrompt) {
return html`<div class="empty-state">No profile context available</div>`;
}
return html`
<div class="session-context">
${profile ? html`
<div class="session-context-row">
<span class="context-label">Profile:</span>
<span class="context-value">${profileNames[profile] || profile}</span>
</div>
` : ''}
${customPrompt ? html`
<div class="session-context-row">
<span class="context-label">Custom Prompt:</span>
<span class="custom-prompt-value">${customPrompt}</span>
</div>
` : ''}
</div>
`;
}
renderConversationContent() {
const { conversationHistory } = this.selectedSession;
// Flatten the conversation turns into individual messages
const messages = [];
if (conversationHistory) {
conversationHistory.forEach(turn => {
if (turn.transcription) {
messages.push({
type: 'user',
content: turn.transcription,
timestamp: turn.timestamp,
});
}
if (turn.ai_response) {
messages.push({
type: 'ai',
content: turn.ai_response,
timestamp: turn.timestamp,
});
}
});
}
if (messages.length === 0) {
return html`<div class="empty-state">No conversation data available</div>`;
}
return messages.map(message => html`<div class="message ${message.type}">${message.content}</div>`);
}
renderScreenAnalysisContent() {
const { screenAnalysisHistory } = this.selectedSession;
if (!screenAnalysisHistory || screenAnalysisHistory.length === 0) {
return html`<div class="empty-state">No screen analysis data available</div>`;
}
return screenAnalysisHistory.map(analysis => html`
<div class="message screen"><div class="analysis-meta">${this.formatTimestamp(analysis.timestamp)} ${analysis.model || 'unknown model'}</div>${analysis.response}</div>
`);
}
renderConversationView() {
if (!this.selectedSession) return html``;
const { conversationHistory, screenAnalysisHistory, profile, customPrompt } = this.selectedSession;
const hasConversation = conversationHistory && conversationHistory.length > 0;
const hasScreenAnalysis = screenAnalysisHistory && screenAnalysisHistory.length > 0;
const hasContext = profile || customPrompt;
return html`
<div class="back-header">
<button class="back-button" @click=${this.handleBackClick}>
<svg
width="16px"
height="16px"
stroke-width="1.7"
viewBox="0 0 24 24"
fill="none"
xmlns="http://www.w3.org/2000/svg"
color="currentColor"
>
<path d="M15 6L9 12L15 18" stroke="currentColor" stroke-width="1.7" stroke-linecap="round" stroke-linejoin="round"></path>
</svg>
Back to Sessions
</button>
<div class="legend">
<div class="legend-item">
<div class="legend-dot user"></div>
<span>Them</span>
</div>
<div class="legend-item">
<div class="legend-dot ai"></div>
<span>Suggestion</span>
</div>
<div class="legend-item">
<div class="legend-dot screen"></div>
<span>Screen</span>
</div>
</div>
</div>
<div class="view-tabs">
<button
class="view-tab ${this.activeTab === 'conversation' ? 'active' : ''}"
@click=${() => this.handleTabClick('conversation')}
>
Conversation ${hasConversation ? `(${conversationHistory.length})` : ''}
</button>
<button
class="view-tab ${this.activeTab === 'screen' ? 'active' : ''}"
@click=${() => this.handleTabClick('screen')}
>
Screen ${hasScreenAnalysis ? `(${screenAnalysisHistory.length})` : ''}
</button>
<button
class="view-tab ${this.activeTab === 'context' ? 'active' : ''}"
@click=${() => this.handleTabClick('context')}
>
Context ${hasContext ? '' : '(empty)'}
</button>
</div>
<div class="conversation-view">
${this.activeTab === 'conversation'
? this.renderConversationContent()
: this.activeTab === 'screen'
? this.renderScreenAnalysisContent()
: this.renderContextContent()}
</div>
`;
}
render() {
if (this.selectedSession) {
return html`<div class="history-container">${this.renderConversationView()}</div>`;
}
return html`
<div class="history-container">
${this.renderSessionsList()}
</div>
`;
}
}
customElements.define('history-view', HistoryView);
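The preview string built by `getSessionPreview` above can be sketched as a pure function: non-zero counts and the profile display name are joined with `' • '`, and an empty session falls back to a placeholder. The function name and the plain-object argument are illustrative, not part of the component.

```javascript
// Mirror of HistoryView.getSessionPreview as a standalone function;
// profileName is assumed to be an already-resolved display name
// (cf. getProfileNames() in the component).
function sessionPreview({ messageCount = 0, screenAnalysisCount = 0, profileName = null } = {}) {
    const parts = [];
    if (messageCount > 0) parts.push(`${messageCount} messages`);
    if (screenAnalysisCount > 0) parts.push(`${screenAnalysisCount} screen analyses`);
    if (profileName) parts.push(profileName);
    return parts.length > 0 ? parts.join(' • ') : 'Empty session';
}

sessionPreview({ messageCount: 4, profileName: 'Job Interview' }); // '4 messages • Job Interview'
```

Keeping this logic pure makes the session-list rendering trivially testable without instantiating the element.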

@@ -0,0 +1,241 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
import { resizeLayout } from '../../utils/windowResize.js';
export class MainView extends LitElement {
static styles = css`
* {
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
cursor: default;
user-select: none;
}
.welcome {
font-size: 20px;
margin-bottom: 6px;
font-weight: 500;
color: var(--text-color);
margin-top: auto;
}
.input-group {
display: flex;
gap: 10px;
margin-bottom: 16px;
}
.input-group input {
flex: 1;
}
input {
background: var(--input-background);
color: var(--text-color);
border: 1px solid var(--border-color);
padding: 10px 12px;
width: 100%;
border-radius: 3px;
font-size: 13px;
transition: border-color 0.1s ease;
}
input:focus {
outline: none;
border-color: var(--border-default);
}
input::placeholder {
color: var(--placeholder-color);
}
/* Red blink animation for empty API key */
input.api-key-error {
animation: blink-red 0.6s ease-in-out;
border-color: var(--error-color);
}
@keyframes blink-red {
0%, 100% {
border-color: var(--border-color);
}
50% {
border-color: var(--error-color);
background: rgba(241, 76, 76, 0.1);
}
}
.start-button {
background: var(--start-button-background);
color: var(--start-button-color);
border: none;
padding: 10px 16px;
border-radius: 3px;
font-size: 13px;
font-weight: 500;
white-space: nowrap;
display: flex;
align-items: center;
gap: 8px;
transition: background 0.1s ease;
}
.start-button:hover {
background: var(--start-button-hover-background);
}
.start-button.initializing {
opacity: 0.5;
cursor: not-allowed;
}
.start-button.initializing:hover {
background: var(--start-button-background);
}
.shortcut-hint {
font-size: 11px;
color: var(--text-muted);
font-family: 'SF Mono', Monaco, monospace;
}
.description {
color: var(--text-secondary);
font-size: 13px;
margin-bottom: 20px;
line-height: 1.5;
}
.link {
color: var(--text-color);
text-decoration: underline;
cursor: pointer;
text-underline-offset: 2px;
}
.link:hover {
color: var(--text-color);
}
:host {
height: 100%;
display: flex;
flex-direction: column;
width: 100%;
max-width: 480px;
}
`;
static properties = {
onStart: { type: Function },
onAPIKeyHelp: { type: Function },
isInitializing: { type: Boolean },
onLayoutModeChange: { type: Function },
showApiKeyError: { type: Boolean },
};
constructor() {
super();
this.onStart = () => {};
this.onAPIKeyHelp = () => {};
this.isInitializing = false;
this.onLayoutModeChange = () => {};
this.showApiKeyError = false;
this.boundKeydownHandler = this.handleKeydown.bind(this);
this.apiKey = '';
this._loadApiKey();
}
async _loadApiKey() {
this.apiKey = await cheatingDaddy.storage.getApiKey();
this.requestUpdate();
}
connectedCallback() {
super.connectedCallback();
window.electron?.ipcRenderer?.on('session-initializing', (event, isInitializing) => {
this.isInitializing = isInitializing;
});
// Add keyboard event listener for Ctrl+Enter (or Cmd+Enter on Mac)
document.addEventListener('keydown', this.boundKeydownHandler);
// Resize window for this view
resizeLayout();
}
disconnectedCallback() {
super.disconnectedCallback();
window.electron?.ipcRenderer?.removeAllListeners('session-initializing');
// Remove keyboard event listener
document.removeEventListener('keydown', this.boundKeydownHandler);
}
handleKeydown(e) {
const isMac = navigator.platform.toUpperCase().indexOf('MAC') >= 0;
const isStartShortcut = isMac ? e.metaKey && e.key === 'Enter' : e.ctrlKey && e.key === 'Enter';
if (isStartShortcut) {
e.preventDefault();
this.handleStartClick();
}
}
async handleInput(e) {
this.apiKey = e.target.value;
await cheatingDaddy.storage.setApiKey(e.target.value);
// Clear error state when user starts typing
if (this.showApiKeyError) {
this.showApiKeyError = false;
}
}
handleStartClick() {
if (this.isInitializing) {
return;
}
this.onStart();
}
handleAPIKeyHelpClick() {
this.onAPIKeyHelp();
}
// Method to trigger the red blink animation
triggerApiKeyError() {
this.showApiKeyError = true;
// Remove the error class after 1 second
setTimeout(() => {
this.showApiKeyError = false;
}, 1000);
}
getStartButtonText() {
const isMac = navigator.platform.toUpperCase().indexOf('MAC') >= 0;
const shortcut = isMac ? 'Cmd+Enter' : 'Ctrl+Enter';
return html`Start <span class="shortcut-hint">${shortcut}</span>`;
}
render() {
return html`
<div class="welcome">Welcome</div>
<div class="input-group">
<input
type="password"
placeholder="Enter your Gemini API Key"
.value=${this.apiKey}
@input=${this.handleInput}
class="${this.showApiKeyError ? 'api-key-error' : ''}"
/>
<button @click=${this.handleStartClick} class="start-button ${this.isInitializing ? 'initializing' : ''}">
${this.getStartButtonText()}
</button>
</div>
<p class="description">
Don't have an API key?
<span @click=${this.handleAPIKeyHelpClick} class="link">get one here</span>
</p>
`;
}
}
customElements.define('main-view', MainView);
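MainView derives the platform twice (in `handleKeydown` and `getStartButtonText`) and inlines the shortcut check. The check could be factored into a pure helper; a minimal sketch, where `isStartShortcut` is a hypothetical name and `isMac` is passed in rather than sniffed from `navigator.platform`:

```javascript
// Sketch only: the Ctrl+Enter / Cmd+Enter check from MainView.handleKeydown
// as a pure function. `isMac` is injected so the logic is testable without
// a DOM or a real `navigator` object (hypothetical refactor, not app code).
function isStartShortcut(event, isMac) {
    if (event.key !== 'Enter') return false;
    return isMac ? event.metaKey : event.ctrlKey;
}
```

The component could then call `isStartShortcut(e, isMac)` from `handleKeydown` and reuse the same `isMac` flag when rendering the shortcut hint.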

@@ -0,0 +1,592 @@
import { html, css, LitElement } from '../../assets/lit-core-2.7.4.min.js';
export class OnboardingView extends LitElement {
static styles = css`
* {
font-family:
'Inter',
-apple-system,
BlinkMacSystemFont,
'Segoe UI',
Roboto,
sans-serif;
cursor: default;
user-select: none;
margin: 0;
padding: 0;
box-sizing: border-box;
}
:host {
display: block;
height: 100%;
width: 100%;
position: fixed;
top: 0;
left: 0;
overflow: hidden;
}
.onboarding-container {
position: relative;
width: 100%;
height: 100%;
background: #0a0a0a;
overflow: hidden;
}
.close-button {
position: absolute;
top: 12px;
right: 12px;
z-index: 10;
background: rgba(255, 255, 255, 0.08);
border: 1px solid rgba(255, 255, 255, 0.1);
border-radius: 6px;
width: 32px;
height: 32px;
display: flex;
align-items: center;
justify-content: center;
cursor: pointer;
transition: all 0.2s ease;
color: rgba(255, 255, 255, 0.6);
}
.close-button:hover {
background: rgba(255, 255, 255, 0.12);
border-color: rgba(255, 255, 255, 0.2);
color: rgba(255, 255, 255, 0.9);
}
.close-button svg {
width: 16px;
height: 16px;
}
.gradient-canvas {
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
z-index: 0;
}
.content-wrapper {
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 60px;
z-index: 1;
display: flex;
flex-direction: column;
justify-content: center;
padding: 32px 48px;
max-width: 500px;
color: #e5e5e5;
overflow: hidden;
}
.slide-icon {
width: 48px;
height: 48px;
margin-bottom: 16px;
opacity: 0.9;
display: block;
}
.slide-title {
font-size: 28px;
font-weight: 600;
margin-bottom: 12px;
color: #ffffff;
line-height: 1.3;
}
.slide-content {
font-size: 16px;
line-height: 1.5;
margin-bottom: 24px;
color: #b8b8b8;
font-weight: 400;
}
.context-textarea {
width: 100%;
height: 100px;
padding: 16px;
border: 1px solid rgba(255, 255, 255, 0.1);
border-radius: 8px;
background: rgba(255, 255, 255, 0.05);
color: #e5e5e5;
font-size: 14px;
font-family: inherit;
resize: vertical;
transition: all 0.2s ease;
margin-bottom: 24px;
}
.context-textarea::placeholder {
color: rgba(255, 255, 255, 0.4);
font-size: 14px;
}
.context-textarea:focus {
outline: none;
border-color: rgba(255, 255, 255, 0.2);
background: rgba(255, 255, 255, 0.08);
}
.feature-list {
max-width: 100%;
}
.feature-item {
display: flex;
align-items: center;
margin-bottom: 12px;
font-size: 15px;
color: #b8b8b8;
}
.feature-icon {
font-size: 16px;
margin-right: 12px;
opacity: 0.8;
}
.navigation {
position: absolute;
bottom: 0;
left: 0;
right: 0;
z-index: 2;
display: flex;
align-items: center;
justify-content: space-between;
padding: 16px 24px;
background: rgba(0, 0, 0, 0.3);
backdrop-filter: blur(10px);
border-top: 1px solid rgba(255, 255, 255, 0.05);
height: 60px;
box-sizing: border-box;
}
.nav-button {
background: rgba(255, 255, 255, 0.08);
border: 1px solid rgba(255, 255, 255, 0.1);
color: #e5e5e5;
padding: 8px 16px;
border-radius: 6px;
font-size: 13px;
font-weight: 500;
cursor: pointer;
transition: all 0.2s ease;
display: flex;
align-items: center;
justify-content: center;
min-width: 36px;
min-height: 36px;
}
.nav-button:hover {
background: rgba(255, 255, 255, 0.12);
border-color: rgba(255, 255, 255, 0.2);
}
.nav-button:active {
transform: scale(0.98);
}
.nav-button:disabled {
opacity: 0.4;
cursor: not-allowed;
}
.nav-button:disabled:hover {
background: rgba(255, 255, 255, 0.08);
border-color: rgba(255, 255, 255, 0.1);
transform: none;
}
.progress-dots {
display: flex;
gap: 12px;
align-items: center;
}
.dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: rgba(255, 255, 255, 0.2);
transition: all 0.2s ease;
cursor: pointer;
}
.dot:hover {
background: rgba(255, 255, 255, 0.4);
}
.dot.active {
background: rgba(255, 255, 255, 0.8);
transform: scale(1.2);
}
`;
static properties = {
currentSlide: { type: Number },
contextText: { type: String },
onComplete: { type: Function },
onClose: { type: Function },
};
constructor() {
super();
this.currentSlide = 0;
this.contextText = '';
this.onComplete = () => {};
this.onClose = () => {};
this.canvas = null;
this.ctx = null;
this.animationId = null;
// Transition properties
this.isTransitioning = false;
this.transitionStartTime = 0;
this.transitionDuration = 800; // 800ms fade duration
this.previousColorScheme = null;
// Subtle dark color schemes for each slide
this.colorSchemes = [
// Slide 1 - Welcome (Very dark purple/gray)
[
[25, 25, 35], // Dark gray-purple
[20, 20, 30], // Darker gray
[30, 25, 40], // Slightly purple
[15, 15, 25], // Very dark
[35, 30, 45], // Muted purple
[10, 10, 20], // Almost black
],
// Slide 2 - Privacy (Dark blue-gray)
[
[20, 25, 35], // Dark blue-gray
[15, 20, 30], // Darker blue-gray
[25, 30, 40], // Slightly blue
[10, 15, 25], // Very dark blue
[30, 35, 45], // Muted blue
[5, 10, 20], // Almost black
],
// Slide 3 - Context (Dark neutral)
[
[25, 25, 25], // Neutral dark
[20, 20, 20], // Darker neutral
[30, 30, 30], // Light dark
[15, 15, 15], // Very dark
[35, 35, 35], // Lighter dark
[10, 10, 10], // Almost black
],
// Slide 4 - Features (Dark green-gray)
[
[20, 30, 25], // Dark green-gray
[15, 25, 20], // Darker green-gray
[25, 35, 30], // Slightly green
[10, 20, 15], // Very dark green
[30, 40, 35], // Muted green
[5, 15, 10], // Almost black
],
// Slide 5 - Complete (Dark warm gray)
[
[30, 25, 20], // Dark warm gray
[25, 20, 15], // Darker warm
[35, 30, 25], // Slightly warm
[20, 15, 10], // Very dark warm
[40, 35, 30], // Muted warm
[15, 10, 5], // Almost black
],
];
}
firstUpdated() {
this.canvas = this.shadowRoot.querySelector('.gradient-canvas');
this.ctx = this.canvas.getContext('2d');
this.resizeCanvas();
this.startGradientAnimation();
// Keep a reference so the listener can actually be removed later;
// removeEventListener with a fresh arrow function is a silent no-op.
this._resizeHandler = () => this.resizeCanvas();
window.addEventListener('resize', this._resizeHandler);
}
disconnectedCallback() {
super.disconnectedCallback();
if (this.animationId) {
cancelAnimationFrame(this.animationId);
}
window.removeEventListener('resize', this._resizeHandler);
}
resizeCanvas() {
if (!this.canvas) return;
const rect = this.getBoundingClientRect();
this.canvas.width = rect.width;
this.canvas.height = rect.height;
}
startGradientAnimation() {
if (!this.ctx) return;
const animate = timestamp => {
this.drawGradient(timestamp);
this.animationId = requestAnimationFrame(animate);
};
animate(0);
}
drawGradient(timestamp) {
if (!this.ctx || !this.canvas) return;
const { width, height } = this.canvas;
let colors = this.colorSchemes[this.currentSlide];
// Handle color scheme transitions
if (this.isTransitioning && this.previousColorScheme) {
const elapsed = timestamp - this.transitionStartTime;
const progress = Math.min(elapsed / this.transitionDuration, 1);
// Use easing function for smoother transition
const easedProgress = this.easeInOutCubic(progress);
colors = this.interpolateColorSchemes(this.previousColorScheme, this.colorSchemes[this.currentSlide], easedProgress);
// End transition when complete
if (progress >= 1) {
this.isTransitioning = false;
this.previousColorScheme = null;
}
}
const time = timestamp * 0.0005; // Much slower animation
// Create moving gradient with subtle flow
const flowX = Math.sin(time * 0.7) * width * 0.3;
const flowY = Math.cos(time * 0.5) * height * 0.2;
const gradient = this.ctx.createLinearGradient(flowX, flowY, width + flowX * 0.5, height + flowY * 0.5);
// Very subtle color variations with movement
colors.forEach((color, index) => {
const offset = index / (colors.length - 1);
const wave = Math.sin(time + index * 0.3) * 0.05; // Very subtle wave
const r = Math.max(0, Math.min(255, color[0] + wave * 5));
const g = Math.max(0, Math.min(255, color[1] + wave * 5));
const b = Math.max(0, Math.min(255, color[2] + wave * 5));
gradient.addColorStop(offset, `rgb(${r}, ${g}, ${b})`);
});
// Fill with moving gradient
this.ctx.fillStyle = gradient;
this.ctx.fillRect(0, 0, width, height);
// Add a second layer with radial gradient for more depth
const centerX = width * 0.5 + Math.sin(time * 0.3) * width * 0.15;
const centerY = height * 0.5 + Math.cos(time * 0.4) * height * 0.1;
const radius = Math.max(width, height) * 0.8;
const radialGradient = this.ctx.createRadialGradient(centerX, centerY, 0, centerX, centerY, radius);
// Very subtle radial overlay
radialGradient.addColorStop(0, `rgba(${colors[0][0] + 10}, ${colors[0][1] + 10}, ${colors[0][2] + 10}, 0.1)`);
radialGradient.addColorStop(0.5, `rgba(${colors[2][0]}, ${colors[2][1]}, ${colors[2][2]}, 0.05)`);
radialGradient.addColorStop(
1,
`rgba(${colors[colors.length - 1][0]}, ${colors[colors.length - 1][1]}, ${colors[colors.length - 1][2]}, 0.03)`
);
this.ctx.globalCompositeOperation = 'overlay';
this.ctx.fillStyle = radialGradient;
this.ctx.fillRect(0, 0, width, height);
this.ctx.globalCompositeOperation = 'source-over';
}
nextSlide() {
if (this.currentSlide < 4) {
this.startColorTransition(this.currentSlide + 1);
} else {
this.completeOnboarding();
}
}
prevSlide() {
if (this.currentSlide > 0) {
this.startColorTransition(this.currentSlide - 1);
}
}
startColorTransition(newSlide) {
this.previousColorScheme = [...this.colorSchemes[this.currentSlide]];
this.currentSlide = newSlide;
this.isTransitioning = true;
this.transitionStartTime = performance.now();
}
// Interpolate between two color schemes
interpolateColorSchemes(scheme1, scheme2, progress) {
return scheme1.map((color1, index) => {
const color2 = scheme2[index];
return [
color1[0] + (color2[0] - color1[0]) * progress,
color1[1] + (color2[1] - color1[1]) * progress,
color1[2] + (color2[2] - color1[2]) * progress,
];
});
}
// Easing function for smooth transitions
easeInOutCubic(t) {
return t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;
}
handleContextInput(e) {
this.contextText = e.target.value;
}
async handleClose() {
if (window.require) {
const { ipcRenderer } = window.require('electron');
await ipcRenderer.invoke('quit-application');
}
}
async completeOnboarding() {
if (this.contextText.trim()) {
await cheatingDaddy.storage.updatePreference('customPrompt', this.contextText.trim());
}
await cheatingDaddy.storage.updateConfig('onboarded', true);
this.onComplete();
}
getSlideContent() {
const slides = [
{
icon: 'assets/onboarding/welcome.svg',
title: 'Welcome to Cheating Daddy',
content:
'Your AI assistant that listens and watches, then provides intelligent suggestions automatically during interviews and meetings.',
},
{
icon: 'assets/onboarding/security.svg',
title: 'Completely Private',
content: 'Invisible to screen sharing apps and recording software. Your secret advantage stays completely hidden from others.',
},
{
icon: 'assets/onboarding/context.svg',
title: 'Add Your Context',
content: 'Share relevant information to help the AI provide better, more personalized assistance.',
showTextarea: true,
},
{
icon: 'assets/onboarding/customize.svg',
title: 'Additional Features',
content: '',
showFeatures: true,
},
{
icon: 'assets/onboarding/ready.svg',
title: 'Ready to Go',
content: 'Add your Gemini API key in settings and start getting AI-powered assistance in real-time.',
},
];
return slides[this.currentSlide];
}
render() {
const slide = this.getSlideContent();
return html`
<div class="onboarding-container">
<button class="close-button" @click=${this.handleClose} title="Close">
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor">
<path d="M6.28 5.22a.75.75 0 0 0-1.06 1.06L8.94 10l-3.72 3.72a.75.75 0 1 0 1.06 1.06L10 11.06l3.72 3.72a.75.75 0 1 0 1.06-1.06L11.06 10l3.72-3.72a.75.75 0 0 0-1.06-1.06L10 8.94 6.28 5.22Z" />
</svg>
</button>
<canvas class="gradient-canvas"></canvas>
<div class="content-wrapper">
<img class="slide-icon" src="${slide.icon}" alt="${slide.title} icon" />
<div class="slide-title">${slide.title}</div>
<div class="slide-content">${slide.content}</div>
${slide.showTextarea
? html`
<textarea
class="context-textarea"
placeholder="Paste your resume, job description, or any relevant context here..."
.value=${this.contextText}
@input=${this.handleContextInput}
></textarea>
`
: ''}
${slide.showFeatures
? html`
<div class="feature-list">
<div class="feature-item">
<span class="feature-icon">-</span>
Customize AI behavior and responses
</div>
<div class="feature-item">
<span class="feature-icon">-</span>
Review conversation history
</div>
<div class="feature-item">
<span class="feature-icon">-</span>
Adjust capture settings and intervals
</div>
</div>
`
: ''}
</div>
<div class="navigation">
<button class="nav-button" @click=${this.prevSlide} ?disabled=${this.currentSlide === 0}>
<svg width="16px" height="16px" stroke-width="2" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M15 6L9 12L15 18" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round"></path>
</svg>
</button>
<div class="progress-dots">
${[0, 1, 2, 3, 4].map(
index => html`
<div
class="dot ${index === this.currentSlide ? 'active' : ''}"
@click=${() => {
if (index !== this.currentSlide) {
this.startColorTransition(index);
}
}}
></div>
`
)}
</div>
<button class="nav-button" @click=${this.nextSlide}>
${this.currentSlide === 4
? 'Get Started'
: html`
<svg width="16px" height="16px" stroke-width="2" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M9 6L15 12L9 18" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round"></path>
</svg>
`}
</button>
</div>
</div>
`;
}
}
customElements.define('onboarding-view', OnboardingView);
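The gradient cross-fade in OnboardingView rests on two pure helpers, `easeInOutCubic` and `interpolateColorSchemes`. Lifted out as free functions (same bodies as the methods above), their math can be checked in isolation:

```javascript
// Same math as OnboardingView's methods, extracted as standalone functions.
function easeInOutCubic(t) {
    // Cubic ease: accelerates until t = 0.5, then decelerates symmetrically.
    return t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;
}

function interpolateColorSchemes(scheme1, scheme2, progress) {
    // Linear blend of each [r, g, b] triple; progress 0 -> scheme1, 1 -> scheme2.
    return scheme1.map((color1, index) => {
        const color2 = scheme2[index];
        return [
            color1[0] + (color2[0] - color1[0]) * progress,
            color1[1] + (color2[1] - color1[1]) * progress,
            color1[2] + (color2[2] - color1[2]) * progress,
        ];
    });
}
```

The easing maps 0 to 0 and 1 to 1 exactly, so the 800 ms fade starts and ends precisely on the per-slide color schemes, with no snap when the transition flag is cleared.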

src/index.html Normal file
@@ -0,0 +1,143 @@
<!doctype html>
<html>
<head>
<meta http-equiv="content-security-policy" content="script-src 'self' 'unsafe-inline'" />
<title>Screen and Audio Capture</title>
<style>
:root {
/* Backgrounds - with default 0.8 transparency */
--background-transparent: transparent;
--bg-primary: rgba(30, 30, 30, 0.8);
--bg-secondary: rgba(37, 37, 38, 0.8);
--bg-tertiary: rgba(45, 45, 45, 0.8);
--bg-hover: rgba(50, 50, 50, 0.8);
/* Text */
--text-color: #e5e5e5;
--text-secondary: #a0a0a0;
--text-muted: #6b6b6b;
--description-color: #a0a0a0;
--placeholder-color: #6b6b6b;
/* Borders */
--border-color: #3c3c3c;
--border-subtle: #3c3c3c;
--border-default: #4a4a4a;
/* Component backgrounds - with default 0.8 transparency */
--header-background: rgba(30, 30, 30, 0.8);
--header-actions-color: #a0a0a0;
--main-content-background: rgba(30, 30, 30, 0.8);
--button-background: transparent;
--button-border: #3c3c3c;
--icon-button-color: #a0a0a0;
--hover-background: rgba(50, 50, 50, 0.8);
--input-background: rgba(45, 45, 45, 0.8);
--input-focus-background: rgba(45, 45, 45, 0.8);
/* Focus states - neutral */
--focus-border-color: #4a4a4a;
--focus-box-shadow: transparent;
/* Scrollbar */
--scrollbar-track: #1e1e1e;
--scrollbar-thumb: #3c3c3c;
--scrollbar-thumb-hover: #4a4a4a;
--scrollbar-background: #1e1e1e;
/* Legacy/misc */
--preview-video-background: #1e1e1e;
--preview-video-border: #3c3c3c;
--option-label-color: #e5e5e5;
--screen-option-background: #252526;
--screen-option-hover-background: #2d2d2d;
--screen-option-selected-background: #323232;
--screen-option-text: #a0a0a0;
/* Buttons */
--start-button-background: #ffffff;
--start-button-color: #1e1e1e;
--start-button-border: #ffffff;
--start-button-hover-background: #e0e0e0;
--start-button-hover-border: #e0e0e0;
--text-input-button-background: #ffffff;
--text-input-button-hover: #e0e0e0;
/* Links - neutral */
--link-color: #e5e5e5;
--key-background: #2d2d2d;
/* Status colors */
--success-color: #4ec9b0;
--warning-color: #dcdcaa;
--error-color: #f14c4c;
--danger-color: #f14c4c;
/* Layout-specific variables */
--header-padding: 8px 16px;
--header-font-size: 14px;
--header-gap: 8px;
--header-button-padding: 6px 12px;
--header-icon-padding: 6px;
--header-font-size-small: 12px;
--main-content-padding: 16px;
--main-content-margin-top: 1px;
--icon-size: 18px;
--border-radius: 3px;
--content-border-radius: 0;
}
/* Compact layout styles */
:root.compact-layout {
--header-padding: 6px 12px;
--header-font-size: 12px;
--header-gap: 6px;
--header-button-padding: 4px 8px;
--header-icon-padding: 4px;
--header-font-size-small: 10px;
--main-content-padding: 12px;
--main-content-margin-top: 1px;
--icon-size: 16px;
--border-radius: 3px;
--content-border-radius: 0;
}
html,
body {
margin: 0;
padding: 0;
height: 100%;
overflow: hidden;
background: transparent;
}
body {
font-family:
'Inter',
-apple-system,
BlinkMacSystemFont,
sans-serif;
}
* {
box-sizing: border-box;
}
cheating-daddy-app {
display: block;
width: 100%;
height: 100%;
}
</style>
</head>
<body>
<script src="assets/marked-4.3.0.min.js"></script>
<script src="assets/highlight-11.9.0.min.js"></script>
<link rel="stylesheet" href="assets/highlight-vscode-dark.min.css" />
<script type="module" src="components/app/CheatingDaddyApp.js"></script>
<cheating-daddy-app id="cheatingDaddy"></cheating-daddy-app>
<script src="script.js"></script>
<script src="utils/renderer.js"></script>
</body>
</html>

src/index.js Normal file
@@ -0,0 +1,320 @@
if (require('electron-squirrel-startup')) {
process.exit(0);
}
const { app, BrowserWindow, shell, ipcMain } = require('electron');
const { createWindow, updateGlobalShortcuts } = require('./utils/window');
const { setupAIProviderIpcHandlers } = require('./utils/ai-provider-manager');
const { stopMacOSAudioCapture } = require('./utils/gemini');
const storage = require('./storage');
const geminiSessionRef = { current: null };
let mainWindow = null;
function sendToRenderer(channel, data) {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
windows[0].webContents.send(channel, data);
}
}
function createMainWindow() {
mainWindow = createWindow(sendToRenderer, geminiSessionRef);
return mainWindow;
}
app.whenReady().then(async () => {
// Initialize storage (checks version, resets if needed)
storage.initializeStorage();
createMainWindow();
setupAIProviderIpcHandlers(geminiSessionRef);
setupStorageIpcHandlers();
setupGeneralIpcHandlers();
});
app.on('window-all-closed', () => {
stopMacOSAudioCapture();
if (process.platform !== 'darwin') {
app.quit();
}
});
app.on('before-quit', () => {
stopMacOSAudioCapture();
});
app.on('activate', () => {
if (BrowserWindow.getAllWindows().length === 0) {
createMainWindow();
}
});
function setupStorageIpcHandlers() {
// ============ CONFIG ============
ipcMain.handle('storage:get-config', async () => {
try {
return { success: true, data: storage.getConfig() };
} catch (error) {
console.error('Error getting config:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-config', async (event, config) => {
try {
storage.setConfig(config);
return { success: true };
} catch (error) {
console.error('Error setting config:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:update-config', async (event, key, value) => {
try {
storage.updateConfig(key, value);
return { success: true };
} catch (error) {
console.error('Error updating config:', error);
return { success: false, error: error.message };
}
});
// ============ CREDENTIALS ============
ipcMain.handle('storage:get-credentials', async () => {
try {
return { success: true, data: storage.getCredentials() };
} catch (error) {
console.error('Error getting credentials:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-credentials', async (event, credentials) => {
try {
storage.setCredentials(credentials);
return { success: true };
} catch (error) {
console.error('Error setting credentials:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:get-api-key', async () => {
try {
return { success: true, data: storage.getApiKey() };
} catch (error) {
console.error('Error getting API key:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-api-key', async (event, apiKey) => {
try {
storage.setApiKey(apiKey);
return { success: true };
} catch (error) {
console.error('Error setting API key:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:get-openai-credentials', async () => {
try {
return { success: true, data: storage.getOpenAICredentials() };
} catch (error) {
console.error('Error getting OpenAI credentials:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-openai-credentials', async (event, config) => {
try {
storage.setOpenAICredentials(config);
return { success: true };
} catch (error) {
console.error('Error setting OpenAI credentials:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:get-openai-sdk-credentials', async () => {
try {
return { success: true, data: storage.getOpenAISDKCredentials() };
} catch (error) {
console.error('Error getting OpenAI SDK credentials:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-openai-sdk-credentials', async (event, config) => {
try {
storage.setOpenAISDKCredentials(config);
return { success: true };
} catch (error) {
console.error('Error setting OpenAI SDK credentials:', error);
return { success: false, error: error.message };
}
});
// ============ PREFERENCES ============
ipcMain.handle('storage:get-preferences', async () => {
try {
return { success: true, data: storage.getPreferences() };
} catch (error) {
console.error('Error getting preferences:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-preferences', async (event, preferences) => {
try {
storage.setPreferences(preferences);
return { success: true };
} catch (error) {
console.error('Error setting preferences:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:update-preference', async (event, key, value) => {
try {
storage.updatePreference(key, value);
return { success: true };
} catch (error) {
console.error('Error updating preference:', error);
return { success: false, error: error.message };
}
});
// ============ KEYBINDS ============
ipcMain.handle('storage:get-keybinds', async () => {
try {
return { success: true, data: storage.getKeybinds() };
} catch (error) {
console.error('Error getting keybinds:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:set-keybinds', async (event, keybinds) => {
try {
storage.setKeybinds(keybinds);
return { success: true };
} catch (error) {
console.error('Error setting keybinds:', error);
return { success: false, error: error.message };
}
});
// ============ HISTORY ============
ipcMain.handle('storage:get-all-sessions', async () => {
try {
return { success: true, data: storage.getAllSessions() };
} catch (error) {
console.error('Error getting sessions:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:get-session', async (event, sessionId) => {
try {
return { success: true, data: storage.getSession(sessionId) };
} catch (error) {
console.error('Error getting session:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:save-session', async (event, sessionId, data) => {
try {
storage.saveSession(sessionId, data);
return { success: true };
} catch (error) {
console.error('Error saving session:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:delete-session', async (event, sessionId) => {
try {
storage.deleteSession(sessionId);
return { success: true };
} catch (error) {
console.error('Error deleting session:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('storage:delete-all-sessions', async () => {
try {
storage.deleteAllSessions();
return { success: true };
} catch (error) {
console.error('Error deleting all sessions:', error);
return { success: false, error: error.message };
}
});
// ============ LIMITS ============
ipcMain.handle('storage:get-today-limits', async () => {
try {
return { success: true, data: storage.getTodayLimits() };
} catch (error) {
console.error('Error getting today limits:', error);
return { success: false, error: error.message };
}
});
// ============ CLEAR ALL ============
ipcMain.handle('storage:clear-all', async () => {
try {
storage.clearAllData();
return { success: true };
} catch (error) {
console.error('Error clearing all data:', error);
return { success: false, error: error.message };
}
});
}
function setupGeneralIpcHandlers() {
ipcMain.handle('get-app-version', async () => {
return app.getVersion();
});
ipcMain.handle('quit-application', async event => {
try {
stopMacOSAudioCapture();
app.quit();
return { success: true };
} catch (error) {
console.error('Error quitting application:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('open-external', async (event, url) => {
try {
await shell.openExternal(url);
return { success: true };
} catch (error) {
console.error('Error opening external URL:', error);
return { success: false, error: error.message };
}
});
ipcMain.on('update-keybinds', (event, newKeybinds) => {
if (mainWindow) {
// Also save to storage
storage.setKeybinds(newKeybinds);
updateGlobalShortcuts(newKeybinds, mainWindow, sendToRenderer, geminiSessionRef);
}
});
// Debug logging from renderer
ipcMain.on('log-message', (event, msg) => {
console.log(msg);
});
}
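Every `storage:*` handler above repeats the same try/catch envelope returning `{ success, data?, error? }`. A sketch of how that boilerplate could be collapsed; `wrapHandler` is a hypothetical helper, not part of this codebase:

```javascript
// Sketch only: wraps a synchronous storage call in the { success, ... }
// envelope used by the IPC handlers above. Hypothetical refactor.
function wrapHandler(fn) {
    return async (event, ...args) => {
        try {
            const data = fn(...args);
            // Write-only handlers return no payload, matching { success: true }.
            return data === undefined ? { success: true } : { success: true, data };
        } catch (error) {
            console.error('Storage handler error:', error);
            return { success: false, error: error.message };
        }
    };
}
```

Registration would then shrink to one line per channel, e.g. `ipcMain.handle('storage:get-config', wrapHandler(() => storage.getConfig()));`. Note the existing setters return a boolean from `writeJsonFile`, so a faithful refactor would discard that return value to preserve the current `{ success: true }` responses.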

src/preload.js Normal file
@@ -0,0 +1,2 @@
// See the Electron documentation for details on how to use preload scripts:
// https://www.electronjs.org/docs/latest/tutorial/process-model#preload-scripts

src/storage.js Normal file
@@ -0,0 +1,508 @@
const fs = require('fs');
const path = require('path');
const os = require('os');
const CONFIG_VERSION = 1;
// Default values
const DEFAULT_CONFIG = {
configVersion: CONFIG_VERSION,
onboarded: false,
layout: 'normal'
};
const DEFAULT_CREDENTIALS = {
apiKey: '',
// OpenAI Realtime API settings
openaiApiKey: '',
openaiBaseUrl: '',
openaiModel: 'gpt-4o-realtime-preview-2024-12-17',
// OpenAI SDK settings (for BotHub and other providers)
openaiSdkApiKey: '',
openaiSdkBaseUrl: '',
openaiSdkModel: 'gpt-4o',
openaiSdkVisionModel: 'gpt-4o',
openaiSdkWhisperModel: 'whisper-1'
};
const DEFAULT_PREFERENCES = {
customPrompt: '',
selectedProfile: 'interview',
selectedLanguage: 'en-US',
selectedScreenshotInterval: '5',
selectedImageQuality: 'medium',
advancedMode: false,
audioMode: 'speaker_only',
fontSize: 'medium',
backgroundTransparency: 0.8,
googleSearchEnabled: false,
aiProvider: 'gemini'
};
const DEFAULT_KEYBINDS = null; // null means use system defaults
const DEFAULT_LIMITS = {
data: [] // Array of { date: 'YYYY-MM-DD', flash: { count: 0 }, flashLite: { count: 0 } }
};
// Get the config directory path based on OS
function getConfigDir() {
const platform = os.platform();
let configDir;
if (platform === 'win32') {
configDir = path.join(os.homedir(), 'AppData', 'Roaming', 'cheating-daddy-config');
} else if (platform === 'darwin') {
configDir = path.join(os.homedir(), 'Library', 'Application Support', 'cheating-daddy-config');
} else {
configDir = path.join(os.homedir(), '.config', 'cheating-daddy-config');
}
return configDir;
}
// File paths
function getConfigPath() {
return path.join(getConfigDir(), 'config.json');
}
function getCredentialsPath() {
return path.join(getConfigDir(), 'credentials.json');
}
function getPreferencesPath() {
return path.join(getConfigDir(), 'preferences.json');
}
function getKeybindsPath() {
return path.join(getConfigDir(), 'keybinds.json');
}
function getLimitsPath() {
return path.join(getConfigDir(), 'limits.json');
}
function getHistoryDir() {
return path.join(getConfigDir(), 'history');
}
// Helper to read JSON file safely
function readJsonFile(filePath, defaultValue) {
try {
if (fs.existsSync(filePath)) {
const data = fs.readFileSync(filePath, 'utf8');
return JSON.parse(data);
}
} catch (error) {
console.warn(`Error reading ${filePath}:`, error.message);
}
return defaultValue;
}
// Helper to write JSON file safely
function writeJsonFile(filePath, data) {
try {
const dir = path.dirname(filePath);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
fs.writeFileSync(filePath, JSON.stringify(data, null, 2), 'utf8');
return true;
} catch (error) {
console.error(`Error writing ${filePath}:`, error.message);
return false;
}
}
// Check if we need to reset (no configVersion or wrong version)
function needsReset() {
const configPath = getConfigPath();
if (!fs.existsSync(configPath)) {
return true;
}
try {
const config = JSON.parse(fs.readFileSync(configPath, 'utf8'));
return config.configVersion !== CONFIG_VERSION; // covers both missing and mismatched versions
} catch {
return true;
}
}
// Wipe and reinitialize the config directory
function resetConfigDir() {
const configDir = getConfigDir();
console.log('Resetting config directory...');
// Remove existing directory if it exists
if (fs.existsSync(configDir)) {
fs.rmSync(configDir, { recursive: true, force: true });
}
// Create fresh directory structure
fs.mkdirSync(configDir, { recursive: true });
fs.mkdirSync(getHistoryDir(), { recursive: true });
// Initialize with defaults
writeJsonFile(getConfigPath(), DEFAULT_CONFIG);
writeJsonFile(getCredentialsPath(), DEFAULT_CREDENTIALS);
writeJsonFile(getPreferencesPath(), DEFAULT_PREFERENCES);
console.log('Config directory initialized with defaults');
}
// Initialize storage - call this on app startup
function initializeStorage() {
if (needsReset()) {
resetConfigDir();
} else {
// Ensure history directory exists
const historyDir = getHistoryDir();
if (!fs.existsSync(historyDir)) {
fs.mkdirSync(historyDir, { recursive: true });
}
}
}
// ============ CONFIG ============
function getConfig() {
return readJsonFile(getConfigPath(), DEFAULT_CONFIG);
}
function setConfig(config) {
const current = getConfig();
const updated = { ...current, ...config, configVersion: CONFIG_VERSION };
return writeJsonFile(getConfigPath(), updated);
}
function updateConfig(key, value) {
const config = getConfig();
config[key] = value;
return writeJsonFile(getConfigPath(), config);
}
// ============ CREDENTIALS ============
function getCredentials() {
return readJsonFile(getCredentialsPath(), DEFAULT_CREDENTIALS);
}
function setCredentials(credentials) {
const current = getCredentials();
const updated = { ...current, ...credentials };
return writeJsonFile(getCredentialsPath(), updated);
}
function getApiKey() {
return getCredentials().apiKey || '';
}
function setApiKey(apiKey) {
return setCredentials({ apiKey });
}
function getOpenAICredentials() {
const creds = getCredentials();
return {
apiKey: creds.openaiApiKey || '',
baseUrl: creds.openaiBaseUrl || '',
model: creds.openaiModel || 'gpt-4o-realtime-preview-2024-12-17'
};
}
function setOpenAICredentials(config) {
const updates = {};
if (config.apiKey !== undefined) updates.openaiApiKey = config.apiKey;
if (config.baseUrl !== undefined) updates.openaiBaseUrl = config.baseUrl;
if (config.model !== undefined) updates.openaiModel = config.model;
return setCredentials(updates);
}
function getOpenAISDKCredentials() {
const creds = getCredentials();
return {
apiKey: creds.openaiSdkApiKey || '',
baseUrl: creds.openaiSdkBaseUrl || '',
model: creds.openaiSdkModel || 'gpt-4o',
visionModel: creds.openaiSdkVisionModel || 'gpt-4o',
whisperModel: creds.openaiSdkWhisperModel || 'whisper-1'
};
}
function setOpenAISDKCredentials(config) {
const updates = {};
if (config.apiKey !== undefined) updates.openaiSdkApiKey = config.apiKey;
if (config.baseUrl !== undefined) updates.openaiSdkBaseUrl = config.baseUrl;
if (config.model !== undefined) updates.openaiSdkModel = config.model;
if (config.visionModel !== undefined) updates.openaiSdkVisionModel = config.visionModel;
if (config.whisperModel !== undefined) updates.openaiSdkWhisperModel = config.whisperModel;
return setCredentials(updates);
}
// ============ PREFERENCES ============
function getPreferences() {
const saved = readJsonFile(getPreferencesPath(), {});
return { ...DEFAULT_PREFERENCES, ...saved };
}
function setPreferences(preferences) {
const current = getPreferences();
const updated = { ...current, ...preferences };
return writeJsonFile(getPreferencesPath(), updated);
}
function updatePreference(key, value) {
const preferences = getPreferences();
preferences[key] = value;
return writeJsonFile(getPreferencesPath(), preferences);
}
// ============ KEYBINDS ============
function getKeybinds() {
return readJsonFile(getKeybindsPath(), DEFAULT_KEYBINDS);
}
function setKeybinds(keybinds) {
return writeJsonFile(getKeybindsPath(), keybinds);
}
// ============ LIMITS (Rate Limiting) ============
function getLimits() {
return readJsonFile(getLimitsPath(), DEFAULT_LIMITS);
}
function setLimits(limits) {
return writeJsonFile(getLimitsPath(), limits);
}
function getTodayDateString() {
const now = new Date();
return now.toISOString().split('T')[0]; // YYYY-MM-DD (UTC date, so the daily counter rolls over at midnight UTC)
}
function getTodayLimits() {
const limits = getLimits();
const today = getTodayDateString();
// Find today's entry
const todayEntry = limits.data.find(entry => entry.date === today);
if (todayEntry) {
return todayEntry;
}
// No entry for today - drop any stale entries and create a fresh one
limits.data = [];
const newEntry = {
date: today,
flash: { count: 0 },
flashLite: { count: 0 }
};
limits.data.push(newEntry);
setLimits(limits);
return newEntry;
}
function incrementLimitCount(model) {
const limits = getLimits();
const today = getTodayDateString();
// Find or create today's entry
let todayEntry = limits.data.find(entry => entry.date === today);
if (!todayEntry) {
// Clean old entries and create new one
limits.data = [];
todayEntry = {
date: today,
flash: { count: 0 },
flashLite: { count: 0 }
};
limits.data.push(todayEntry);
} else {
// Clean old entries, keep only today
limits.data = limits.data.filter(entry => entry.date === today);
}
// Increment the appropriate model count
if (model === 'gemini-2.5-flash') {
todayEntry.flash.count++;
} else if (model === 'gemini-2.5-flash-lite') {
todayEntry.flashLite.count++;
}
setLimits(limits);
return todayEntry;
}
function getAvailableModel() {
const todayLimits = getTodayLimits();
// RPD limits: flash = 20, flash-lite = 20
// After both exhausted, fall back to flash (for paid API users)
if (todayLimits.flash.count < 20) {
return 'gemini-2.5-flash';
} else if (todayLimits.flashLite.count < 20) {
return 'gemini-2.5-flash-lite';
}
return 'gemini-2.5-flash'; // Default to flash for paid API users
}
// ============ HISTORY ============
function getSessionPath(sessionId) {
return path.join(getHistoryDir(), `${sessionId}.json`);
}
function saveSession(sessionId, data) {
const sessionPath = getSessionPath(sessionId);
// Load existing session to preserve metadata
const existingSession = readJsonFile(sessionPath, null);
const sessionData = {
sessionId,
createdAt: existingSession?.createdAt || parseInt(sessionId),
lastUpdated: Date.now(),
// Profile context - set once when session starts
profile: data.profile || existingSession?.profile || null,
customPrompt: data.customPrompt || existingSession?.customPrompt || null,
// Conversation data
conversationHistory: data.conversationHistory || existingSession?.conversationHistory || [],
screenAnalysisHistory: data.screenAnalysisHistory || existingSession?.screenAnalysisHistory || []
};
return writeJsonFile(sessionPath, sessionData);
}
function getSession(sessionId) {
return readJsonFile(getSessionPath(sessionId), null);
}
function getAllSessions() {
const historyDir = getHistoryDir();
try {
if (!fs.existsSync(historyDir)) {
return [];
}
const files = fs.readdirSync(historyDir)
.filter(f => f.endsWith('.json'))
.sort((a, b) => {
// Sort by timestamp descending (newest first)
const tsA = parseInt(a.replace('.json', ''));
const tsB = parseInt(b.replace('.json', ''));
return tsB - tsA;
});
return files.map(file => {
const sessionId = file.replace('.json', '');
const data = readJsonFile(path.join(historyDir, file), null);
if (data) {
return {
sessionId,
createdAt: data.createdAt,
lastUpdated: data.lastUpdated,
messageCount: data.conversationHistory?.length || 0,
screenAnalysisCount: data.screenAnalysisHistory?.length || 0,
profile: data.profile || null,
customPrompt: data.customPrompt || null
};
}
return null;
}).filter(Boolean);
} catch (error) {
console.error('Error reading sessions:', error.message);
return [];
}
}
function deleteSession(sessionId) {
const sessionPath = getSessionPath(sessionId);
try {
if (fs.existsSync(sessionPath)) {
fs.unlinkSync(sessionPath);
return true;
}
} catch (error) {
console.error('Error deleting session:', error.message);
}
return false;
}
function deleteAllSessions() {
const historyDir = getHistoryDir();
try {
if (fs.existsSync(historyDir)) {
const files = fs.readdirSync(historyDir).filter(f => f.endsWith('.json'));
files.forEach(file => {
fs.unlinkSync(path.join(historyDir, file));
});
}
return true;
} catch (error) {
console.error('Error deleting all sessions:', error.message);
return false;
}
}
// ============ CLEAR ALL DATA ============
function clearAllData() {
resetConfigDir();
return true;
}
module.exports = {
// Initialization
initializeStorage,
getConfigDir,
// Config
getConfig,
setConfig,
updateConfig,
// Credentials
getCredentials,
setCredentials,
getApiKey,
setApiKey,
getOpenAICredentials,
setOpenAICredentials,
getOpenAISDKCredentials,
setOpenAISDKCredentials,
// Preferences
getPreferences,
setPreferences,
updatePreference,
// Keybinds
getKeybinds,
setKeybinds,
// Limits (Rate Limiting)
getLimits,
setLimits,
getTodayLimits,
incrementLimitCount,
getAvailableModel,
// History
saveSession,
getSession,
getAllSessions,
deleteSession,
deleteAllSessions,
// Clear all
clearAllData
};

View File

@ -0,0 +1,453 @@
const { BrowserWindow, ipcMain } = require('electron');
const { getSystemPrompt } = require('./prompts');
const { getAvailableModel, incrementLimitCount, getApiKey, getOpenAICredentials, getOpenAISDKCredentials, getPreferences } = require('../storage');
// Import provider implementations
const geminiProvider = require('./gemini');
const openaiRealtimeProvider = require('./openai-realtime');
const openaiSdkProvider = require('./openai-sdk');
// Conversation tracking (shared across providers)
let currentSessionId = null;
let conversationHistory = [];
let screenAnalysisHistory = [];
let currentProfile = null;
let currentCustomPrompt = null;
let currentProvider = 'gemini'; // 'gemini', 'openai-realtime', or 'openai-sdk'
let providerConfig = {};
function sendToRenderer(channel, data) {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
windows[0].webContents.send(channel, data);
}
}
function initializeNewSession(profile = null, customPrompt = null) {
currentSessionId = Date.now().toString();
conversationHistory = [];
screenAnalysisHistory = [];
currentProfile = profile;
currentCustomPrompt = customPrompt;
console.log('New conversation session started:', currentSessionId, 'profile:', profile, 'provider:', currentProvider);
if (profile) {
sendToRenderer('save-session-context', {
sessionId: currentSessionId,
profile: profile,
customPrompt: customPrompt || '',
provider: currentProvider,
});
}
}
function saveConversationTurn(transcription, aiResponse) {
if (!currentSessionId) {
initializeNewSession();
}
const conversationTurn = {
timestamp: Date.now(),
transcription: transcription.trim(),
ai_response: aiResponse.trim(),
};
conversationHistory.push(conversationTurn);
console.log('Saved conversation turn:', conversationTurn);
sendToRenderer('save-conversation-turn', {
sessionId: currentSessionId,
turn: conversationTurn,
fullHistory: conversationHistory,
});
}
function saveScreenAnalysis(prompt, response, model) {
if (!currentSessionId) {
initializeNewSession();
}
const analysisEntry = {
timestamp: Date.now(),
prompt: prompt,
response: response.trim(),
model: model,
provider: currentProvider,
};
screenAnalysisHistory.push(analysisEntry);
console.log('Saved screen analysis:', analysisEntry);
sendToRenderer('save-screen-analysis', {
sessionId: currentSessionId,
analysis: analysisEntry,
fullHistory: screenAnalysisHistory,
profile: currentProfile,
customPrompt: currentCustomPrompt,
});
}
function getCurrentSessionData() {
return {
sessionId: currentSessionId,
history: conversationHistory,
provider: currentProvider,
};
}
// Get provider configuration from storage
async function getStoredSetting(key, defaultValue) {
try {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
await new Promise(resolve => setTimeout(resolve, 100));
const value = await windows[0].webContents.executeJavaScript(`
(function() {
try {
if (typeof localStorage === 'undefined') {
return '${defaultValue}';
}
const stored = localStorage.getItem('${key}');
return stored || '${defaultValue}';
} catch (e) {
return '${defaultValue}';
}
})()
`);
return value;
}
} catch (error) {
console.error('Error getting stored setting for', key, ':', error.message);
}
return defaultValue;
}
// Initialize AI session based on selected provider
async function initializeAISession(customPrompt = '', profile = 'interview', language = 'en-US') {
// Read provider from file-based storage (preferences.json)
const prefs = getPreferences();
const provider = prefs.aiProvider || 'gemini';
currentProvider = provider;
console.log('Initializing AI session with provider:', provider);
// Check if Google Search is enabled for system prompt
const googleSearchEnabled = prefs.googleSearchEnabled ?? true;
const systemPrompt = getSystemPrompt(profile, customPrompt, googleSearchEnabled);
if (provider === 'openai-realtime') {
// Get OpenAI Realtime configuration
const creds = getOpenAICredentials();
if (!creds.apiKey) {
sendToRenderer('update-status', 'OpenAI API key not configured');
return false;
}
providerConfig = {
apiKey: creds.apiKey,
baseUrl: creds.baseUrl || null,
model: creds.model,
systemPrompt,
language,
isReconnect: false,
};
initializeNewSession(profile, customPrompt);
try {
await openaiRealtimeProvider.initializeOpenAISession(providerConfig, conversationHistory);
return true;
} catch (error) {
console.error('Failed to initialize OpenAI Realtime session:', error);
sendToRenderer('update-status', 'Failed to connect to OpenAI Realtime');
return false;
}
} else if (provider === 'openai-sdk') {
// Get OpenAI SDK configuration (for BotHub, etc.)
const creds = getOpenAISDKCredentials();
if (!creds.apiKey) {
sendToRenderer('update-status', 'OpenAI SDK API key not configured');
return false;
}
providerConfig = {
apiKey: creds.apiKey,
baseUrl: creds.baseUrl || null,
model: creds.model,
visionModel: creds.visionModel,
whisperModel: creds.whisperModel,
};
initializeNewSession(profile, customPrompt);
try {
await openaiSdkProvider.initializeOpenAISDK(providerConfig);
openaiSdkProvider.setSystemPrompt(systemPrompt);
sendToRenderer('update-status', 'Ready (OpenAI SDK)');
return true;
} catch (error) {
console.error('Failed to initialize OpenAI SDK:', error);
sendToRenderer('update-status', 'Failed to initialize OpenAI SDK: ' + error.message);
return false;
}
} else {
// Use Gemini (default)
const apiKey = getApiKey();
if (!apiKey) {
sendToRenderer('update-status', 'Gemini API key not configured');
return false;
}
const session = await geminiProvider.initializeGeminiSession(apiKey, customPrompt, profile, language);
if (session && global.geminiSessionRef) {
global.geminiSessionRef.current = session;
return true;
}
return false;
}
}
// Send audio to appropriate provider
async function sendAudioContent(data, mimeType, isSystemAudio = true) {
if (currentProvider === 'openai-realtime') {
return await openaiRealtimeProvider.sendAudioToOpenAI(data);
} else if (currentProvider === 'openai-sdk') {
// OpenAI SDK buffers audio and transcribes on flush
return await openaiSdkProvider.processAudioChunk(data, mimeType);
} else {
// Gemini
if (!global.geminiSessionRef?.current) {
return { success: false, error: 'No active Gemini session' };
}
try {
const marker = isSystemAudio ? '.' : ','; // debug progress marker: '.' = system audio, ',' = mic
process.stdout.write(marker);
await global.geminiSessionRef.current.sendRealtimeInput({
audio: { data, mimeType },
});
return { success: true };
} catch (error) {
console.error('Error sending audio to Gemini:', error);
return { success: false, error: error.message };
}
}
}
// Send image to appropriate provider
async function sendImageContent(data, prompt) {
if (currentProvider === 'openai-realtime') {
const creds = getOpenAICredentials();
const result = await openaiRealtimeProvider.sendImageToOpenAI(data, prompt, {
apiKey: creds.apiKey,
baseUrl: creds.baseUrl,
model: creds.model,
});
if (result.success) {
saveScreenAnalysis(prompt, result.text, result.model);
}
return result;
} else if (currentProvider === 'openai-sdk') {
const result = await openaiSdkProvider.sendImageMessage(data, prompt);
if (result.success) {
saveScreenAnalysis(prompt, result.text, result.model);
}
return result;
} else {
// Use Gemini HTTP API
const result = await geminiProvider.sendImageToGeminiHttp(data, prompt);
// Screen analysis is saved inside sendImageToGeminiHttp for Gemini
return result;
}
}
// Send text message to appropriate provider
async function sendTextMessage(text) {
if (currentProvider === 'openai-realtime') {
return await openaiRealtimeProvider.sendTextToOpenAI(text);
} else if (currentProvider === 'openai-sdk') {
const result = await openaiSdkProvider.sendTextMessage(text);
if (result.success && result.text) {
saveConversationTurn(text, result.text);
}
return result;
} else {
// Gemini
if (!global.geminiSessionRef?.current) {
return { success: false, error: 'No active Gemini session' };
}
try {
console.log('Sending text message to Gemini:', text);
await global.geminiSessionRef.current.sendRealtimeInput({ text: text.trim() });
return { success: true };
} catch (error) {
console.error('Error sending text to Gemini:', error);
return { success: false, error: error.message };
}
}
}
// Close session for appropriate provider
async function closeSession() {
try {
if (currentProvider === 'openai-realtime') {
openaiRealtimeProvider.closeOpenAISession();
} else if (currentProvider === 'openai-sdk') {
openaiSdkProvider.closeOpenAISDK();
} else {
geminiProvider.stopMacOSAudioCapture();
if (global.geminiSessionRef?.current) {
await global.geminiSessionRef.current.close();
global.geminiSessionRef.current = null;
}
}
return { success: true };
} catch (error) {
console.error('Error closing session:', error);
return { success: false, error: error.message };
}
}
// Setup IPC handlers
function setupAIProviderIpcHandlers(geminiSessionRef) {
// Store reference for Gemini
global.geminiSessionRef = geminiSessionRef;
// Listen for conversation turn save requests from providers
ipcMain.on('save-conversation-turn-data', (event, { transcription, response }) => {
saveConversationTurn(transcription, response);
});
ipcMain.handle('initialize-ai-session', async (event, customPrompt, profile, language) => {
return await initializeAISession(customPrompt, profile, language);
});
ipcMain.handle('send-audio-content', async (event, { data, mimeType }) => {
return await sendAudioContent(data, mimeType, true);
});
ipcMain.handle('send-mic-audio-content', async (event, { data, mimeType }) => {
return await sendAudioContent(data, mimeType, false);
});
ipcMain.handle('send-image-content', async (event, { data, prompt }) => {
return await sendImageContent(data, prompt);
});
ipcMain.handle('send-text-message', async (event, text) => {
return await sendTextMessage(text);
});
ipcMain.handle('close-session', async event => {
return await closeSession();
});
// macOS system audio
ipcMain.handle('start-macos-audio', async event => {
if (process.platform !== 'darwin') {
return {
success: false,
error: 'macOS audio capture is only available on macOS',
};
}
try {
if (currentProvider === 'gemini') {
const success = await geminiProvider.startMacOSAudioCapture(global.geminiSessionRef);
return { success };
} else if (currentProvider === 'openai-sdk') {
const success = await openaiSdkProvider.startMacOSAudioCapture();
return { success };
} else if (currentProvider === 'openai-realtime') {
// OpenAI Realtime uses WebSocket, handle differently if needed
return {
success: false,
error: 'OpenAI Realtime uses WebSocket for audio',
};
}
return {
success: false,
error: 'Unknown provider: ' + currentProvider,
};
} catch (error) {
console.error('Error starting macOS audio capture:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('stop-macos-audio', async event => {
try {
if (currentProvider === 'gemini') {
geminiProvider.stopMacOSAudioCapture();
} else if (currentProvider === 'openai-sdk') {
openaiSdkProvider.stopMacOSAudioCapture();
}
return { success: true };
} catch (error) {
console.error('Error stopping macOS audio capture:', error);
return { success: false, error: error.message };
}
});
// Session management
ipcMain.handle('get-current-session', async event => {
try {
return { success: true, data: getCurrentSessionData() };
} catch (error) {
console.error('Error getting current session:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('start-new-session', async event => {
try {
initializeNewSession();
return { success: true, sessionId: currentSessionId };
} catch (error) {
console.error('Error starting new session:', error);
return { success: false, error: error.message };
}
});
ipcMain.handle('update-google-search-setting', async (event, enabled) => {
try {
console.log('Google Search setting updated to:', enabled);
return { success: true };
} catch (error) {
console.error('Error updating Google Search setting:', error);
return { success: false, error: error.message };
}
});
// Provider switching
ipcMain.handle('switch-ai-provider', async (event, provider) => {
try {
console.log('Switching AI provider to:', provider);
currentProvider = provider;
return { success: true };
} catch (error) {
console.error('Error switching provider:', error);
return { success: false, error: error.message };
}
});
}
module.exports = {
setupAIProviderIpcHandlers,
initializeAISession,
sendAudioContent,
sendImageContent,
sendTextMessage,
closeSession,
getCurrentSessionData,
initializeNewSession,
saveConversationTurn,
};

605
src/utils/gemini.js Normal file
View File

@ -0,0 +1,605 @@
const { GoogleGenAI, Modality } = require('@google/genai');
const { BrowserWindow, ipcMain } = require('electron');
const { spawn } = require('child_process');
const { saveDebugAudio } = require('../audioUtils');
const { getSystemPrompt } = require('./prompts');
const { getAvailableModel, incrementLimitCount, getApiKey } = require('../storage');
// Conversation tracking variables
let currentSessionId = null;
let currentTranscription = '';
let conversationHistory = [];
let screenAnalysisHistory = [];
let currentProfile = null;
let currentCustomPrompt = null;
let isInitializingSession = false;
function formatSpeakerResults(results) {
let text = '';
for (const result of results) {
if (result.transcript && result.speakerId) {
const speakerLabel = result.speakerId === 1 ? 'Interviewer' : 'Candidate';
text += `[${speakerLabel}]: ${result.transcript}\n`;
}
}
return text;
}
module.exports.formatSpeakerResults = formatSpeakerResults;
// Audio capture variables
let systemAudioProc = null;
let messageBuffer = '';
// Reconnection variables
let isUserClosing = false;
let sessionParams = null;
let reconnectAttempts = 0;
const MAX_RECONNECT_ATTEMPTS = 3;
const RECONNECT_DELAY = 2000;
function sendToRenderer(channel, data) {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
windows[0].webContents.send(channel, data);
}
}
// Build context message for session restoration
function buildContextMessage() {
const lastTurns = conversationHistory.slice(-20);
const validTurns = lastTurns.filter(turn => turn.transcription?.trim() && turn.ai_response?.trim());
if (validTurns.length === 0) return null;
const contextLines = validTurns.map(turn =>
`[Interviewer]: ${turn.transcription.trim()}\n[Your answer]: ${turn.ai_response.trim()}`
);
return `Session reconnected. Here's the conversation so far:\n\n${contextLines.join('\n\n')}\n\nContinue from here.`;
}
// Conversation management functions
function initializeNewSession(profile = null, customPrompt = null) {
currentSessionId = Date.now().toString();
currentTranscription = '';
conversationHistory = [];
screenAnalysisHistory = [];
currentProfile = profile;
currentCustomPrompt = customPrompt;
console.log('New conversation session started:', currentSessionId, 'profile:', profile);
// Save initial session with profile context
if (profile) {
sendToRenderer('save-session-context', {
sessionId: currentSessionId,
profile: profile,
customPrompt: customPrompt || ''
});
}
}
function saveConversationTurn(transcription, aiResponse) {
if (!currentSessionId) {
initializeNewSession();
}
const conversationTurn = {
timestamp: Date.now(),
transcription: transcription.trim(),
ai_response: aiResponse.trim(),
};
conversationHistory.push(conversationTurn);
console.log('Saved conversation turn:', conversationTurn);
// Send to renderer to save in IndexedDB
sendToRenderer('save-conversation-turn', {
sessionId: currentSessionId,
turn: conversationTurn,
fullHistory: conversationHistory,
});
}
function saveScreenAnalysis(prompt, response, model) {
if (!currentSessionId) {
initializeNewSession();
}
const analysisEntry = {
timestamp: Date.now(),
prompt: prompt,
response: response.trim(),
model: model
};
screenAnalysisHistory.push(analysisEntry);
console.log('Saved screen analysis:', analysisEntry);
// Send to renderer to save
sendToRenderer('save-screen-analysis', {
sessionId: currentSessionId,
analysis: analysisEntry,
fullHistory: screenAnalysisHistory,
profile: currentProfile,
customPrompt: currentCustomPrompt
});
}
function getCurrentSessionData() {
return {
sessionId: currentSessionId,
history: conversationHistory,
};
}
async function getEnabledTools() {
const tools = [];
// Check if Google Search is enabled (default: true)
const googleSearchEnabled = await getStoredSetting('googleSearchEnabled', 'true');
console.log('Google Search enabled:', googleSearchEnabled);
if (googleSearchEnabled === 'true') {
tools.push({ googleSearch: {} });
console.log('Added Google Search tool');
} else {
console.log('Google Search tool disabled');
}
return tools;
}
async function getStoredSetting(key, defaultValue) {
try {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
// Wait a bit for the renderer to be ready
await new Promise(resolve => setTimeout(resolve, 100));
// Try to get setting from renderer process localStorage
const value = await windows[0].webContents.executeJavaScript(`
(function() {
try {
if (typeof localStorage === 'undefined') {
console.log('localStorage not available yet for ${key}');
return '${defaultValue}';
}
const stored = localStorage.getItem('${key}');
console.log('Retrieved setting ${key}:', stored);
return stored || '${defaultValue}';
} catch (e) {
console.error('Error accessing localStorage for ${key}:', e);
return '${defaultValue}';
}
})()
`);
return value;
}
} catch (error) {
console.error('Error getting stored setting for', key, ':', error.message);
}
console.log('Using default value for', key, ':', defaultValue);
return defaultValue;
}
async function initializeGeminiSession(apiKey, customPrompt = '', profile = 'interview', language = 'en-US', isReconnect = false) {
if (isInitializingSession) {
console.log('Session initialization already in progress');
return null; // keep the return type consistent with the failure path below
}
isInitializingSession = true;
if (!isReconnect) {
sendToRenderer('session-initializing', true);
}
// Store params for reconnection
if (!isReconnect) {
sessionParams = { apiKey, customPrompt, profile, language };
reconnectAttempts = 0;
}
const client = new GoogleGenAI({
vertexai: false,
apiKey: apiKey,
httpOptions: { apiVersion: 'v1alpha' },
});
// Get enabled tools first to determine Google Search status
const enabledTools = await getEnabledTools();
const googleSearchEnabled = enabledTools.some(tool => tool.googleSearch);
const systemPrompt = getSystemPrompt(profile, customPrompt, googleSearchEnabled);
// Initialize new conversation session only on first connect
if (!isReconnect) {
initializeNewSession(profile, customPrompt);
}
try {
const session = await client.live.connect({
model: 'gemini-2.5-flash-native-audio-preview-09-2025',
callbacks: {
onopen: function () {
sendToRenderer('update-status', 'Live session connected');
},
onmessage: function (message) {
console.log('----------------', message);
// Handle input transcription (what was spoken)
if (message.serverContent?.inputTranscription?.results) {
currentTranscription += formatSpeakerResults(message.serverContent.inputTranscription.results);
} else if (message.serverContent?.inputTranscription?.text) {
const text = message.serverContent.inputTranscription.text;
if (text.trim() !== '') {
currentTranscription += text;
}
}
// Handle AI model response via output transcription (native audio model)
if (message.serverContent?.outputTranscription?.text) {
const text = message.serverContent.outputTranscription.text;
// Skip empty transcriptions without returning early, so the generationComplete/turnComplete checks below still run for this message
if (text.trim() !== '') {
const isNewResponse = messageBuffer === '';
messageBuffer += text;
sendToRenderer(isNewResponse ? 'new-response' : 'update-response', messageBuffer);
}
}
if (message.serverContent?.generationComplete) {
// Only send/save if there's actual content
if (messageBuffer.trim() !== '') {
sendToRenderer('update-response', messageBuffer);
// Save conversation turn when we have both transcription and AI response
if (currentTranscription) {
saveConversationTurn(currentTranscription, messageBuffer);
currentTranscription = ''; // Reset for next turn
}
}
messageBuffer = '';
}
if (message.serverContent?.turnComplete) {
sendToRenderer('update-status', 'Listening...');
}
},
onerror: function (e) {
console.log('Session error:', e.message);
sendToRenderer('update-status', 'Error: ' + e.message);
},
onclose: function (e) {
console.log('Session closed:', e.reason);
// Don't reconnect if user intentionally closed
if (isUserClosing) {
isUserClosing = false;
sendToRenderer('update-status', 'Session closed');
return;
}
// Attempt reconnection
if (sessionParams && reconnectAttempts < MAX_RECONNECT_ATTEMPTS) {
attemptReconnect();
} else {
sendToRenderer('update-status', 'Session closed');
}
},
},
config: {
responseModalities: [Modality.AUDIO],
proactivity: { proactiveAudio: true },
outputAudioTranscription: {},
tools: enabledTools,
// Enable speaker diarization
inputAudioTranscription: {
enableSpeakerDiarization: true,
minSpeakerCount: 2,
maxSpeakerCount: 2,
},
contextWindowCompression: { slidingWindow: {} },
speechConfig: { languageCode: language },
systemInstruction: {
parts: [{ text: systemPrompt }],
},
},
});
isInitializingSession = false;
if (!isReconnect) {
sendToRenderer('session-initializing', false);
}
return session;
} catch (error) {
console.error('Failed to initialize Gemini session:', error);
isInitializingSession = false;
if (!isReconnect) {
sendToRenderer('session-initializing', false);
}
return null;
}
}
async function attemptReconnect() {
reconnectAttempts++;
console.log(`Reconnection attempt ${reconnectAttempts}/${MAX_RECONNECT_ATTEMPTS}`);
// Clear stale buffers
messageBuffer = '';
currentTranscription = '';
sendToRenderer('update-status', `Reconnecting... (${reconnectAttempts}/${MAX_RECONNECT_ATTEMPTS})`);
// Wait before attempting
await new Promise(resolve => setTimeout(resolve, RECONNECT_DELAY));
try {
const session = await initializeGeminiSession(
sessionParams.apiKey,
sessionParams.customPrompt,
sessionParams.profile,
sessionParams.language,
true // isReconnect
);
if (session && global.geminiSessionRef) {
global.geminiSessionRef.current = session;
// Restore context from conversation history via text message
const contextMessage = buildContextMessage();
if (contextMessage) {
try {
console.log('Restoring conversation context...');
await session.sendRealtimeInput({ text: contextMessage });
} catch (contextError) {
console.error('Failed to restore context:', contextError);
// Continue without context - better than failing
}
}
// Don't reset reconnectAttempts here - let it reset on next fresh session
sendToRenderer('update-status', 'Reconnected! Listening...');
console.log('Session reconnected successfully');
return true;
}
} catch (error) {
console.error(`Reconnection attempt ${reconnectAttempts} failed:`, error);
}
// If we still have attempts left, try again
if (reconnectAttempts < MAX_RECONNECT_ATTEMPTS) {
return attemptReconnect();
}
// Max attempts reached - notify frontend
console.log('Max reconnection attempts reached');
sendToRenderer('reconnect-failed', {
        message: 'Failed to reconnect after 3 attempts. This is likely an upstream or network issue. Try restarting the app or downloading the latest version from the site.',
});
sessionParams = null;
return false;
}
function killExistingSystemAudioDump() {
return new Promise(resolve => {
console.log('Checking for existing SystemAudioDump processes...');
// Kill any existing SystemAudioDump processes
const killProc = spawn('pkill', ['-f', 'SystemAudioDump'], {
stdio: 'ignore',
});
killProc.on('close', code => {
if (code === 0) {
console.log('Killed existing SystemAudioDump processes');
} else {
console.log('No existing SystemAudioDump processes found');
}
resolve();
});
killProc.on('error', err => {
console.log('Error checking for existing processes (this is normal):', err.message);
resolve();
});
// Timeout after 2 seconds
setTimeout(() => {
killProc.kill();
resolve();
}, 2000);
});
}
async function startMacOSAudioCapture(geminiSessionRef) {
if (process.platform !== 'darwin') return false;
// Kill any existing SystemAudioDump processes first
await killExistingSystemAudioDump();
console.log('Starting macOS audio capture with SystemAudioDump...');
const { app } = require('electron');
const path = require('path');
let systemAudioPath;
if (app.isPackaged) {
systemAudioPath = path.join(process.resourcesPath, 'SystemAudioDump');
} else {
systemAudioPath = path.join(__dirname, '../assets', 'SystemAudioDump');
}
console.log('SystemAudioDump path:', systemAudioPath);
const spawnOptions = {
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
},
};
systemAudioProc = spawn(systemAudioPath, [], spawnOptions);
if (!systemAudioProc.pid) {
console.error('Failed to start SystemAudioDump');
return false;
}
console.log('SystemAudioDump started with PID:', systemAudioProc.pid);
const CHUNK_DURATION = 0.1;
const SAMPLE_RATE = 24000;
const BYTES_PER_SAMPLE = 2;
const CHANNELS = 2;
const CHUNK_SIZE = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * CHUNK_DURATION;
let audioBuffer = Buffer.alloc(0);
systemAudioProc.stdout.on('data', data => {
audioBuffer = Buffer.concat([audioBuffer, data]);
while (audioBuffer.length >= CHUNK_SIZE) {
const chunk = audioBuffer.slice(0, CHUNK_SIZE);
audioBuffer = audioBuffer.slice(CHUNK_SIZE);
const monoChunk = CHANNELS === 2 ? convertStereoToMono(chunk) : chunk;
const base64Data = monoChunk.toString('base64');
sendAudioToGemini(base64Data, geminiSessionRef);
if (process.env.DEBUG_AUDIO) {
console.log(`Processed audio chunk: ${chunk.length} bytes`);
saveDebugAudio(monoChunk, 'system_audio');
}
}
const maxBufferSize = SAMPLE_RATE * BYTES_PER_SAMPLE * 1;
if (audioBuffer.length > maxBufferSize) {
audioBuffer = audioBuffer.slice(-maxBufferSize);
}
});
systemAudioProc.stderr.on('data', data => {
console.error('SystemAudioDump stderr:', data.toString());
});
systemAudioProc.on('close', code => {
console.log('SystemAudioDump process closed with code:', code);
systemAudioProc = null;
});
systemAudioProc.on('error', err => {
console.error('SystemAudioDump process error:', err);
systemAudioProc = null;
});
return true;
}
// Downmix interleaved 16-bit stereo PCM by keeping only the left channel
function convertStereoToMono(stereoBuffer) {
    const samples = stereoBuffer.length / 4; // 4 bytes per stereo frame (2 bytes x 2 channels)
const monoBuffer = Buffer.alloc(samples * 2);
for (let i = 0; i < samples; i++) {
const leftSample = stereoBuffer.readInt16LE(i * 4);
monoBuffer.writeInt16LE(leftSample, i * 2);
}
return monoBuffer;
}
function stopMacOSAudioCapture() {
if (systemAudioProc) {
console.log('Stopping SystemAudioDump...');
systemAudioProc.kill('SIGTERM');
systemAudioProc = null;
}
}
async function sendAudioToGemini(base64Data, geminiSessionRef) {
if (!geminiSessionRef.current) return;
try {
        process.stdout.write('.'); // progress marker: one dot per audio chunk sent
await geminiSessionRef.current.sendRealtimeInput({
audio: {
data: base64Data,
mimeType: 'audio/pcm;rate=24000',
},
});
} catch (error) {
console.error('Error sending audio to Gemini:', error);
}
}
async function sendImageToGeminiHttp(base64Data, prompt) {
// Get available model based on rate limits
const model = getAvailableModel();
const apiKey = getApiKey();
if (!apiKey) {
return { success: false, error: 'No API key configured' };
}
try {
const ai = new GoogleGenAI({ apiKey: apiKey });
const contents = [
{
inlineData: {
mimeType: 'image/jpeg',
data: base64Data,
},
},
{ text: prompt },
];
console.log(`Sending image to ${model} (streaming)...`);
const response = await ai.models.generateContentStream({
model: model,
contents: contents,
});
// Increment count after successful call
incrementLimitCount(model);
// Stream the response
let fullText = '';
let isFirst = true;
for await (const chunk of response) {
const chunkText = chunk.text;
if (chunkText) {
fullText += chunkText;
// Send to renderer - new response for first chunk, update for subsequent
sendToRenderer(isFirst ? 'new-response' : 'update-response', fullText);
isFirst = false;
}
}
console.log(`Image response completed from ${model}`);
// Save screen analysis to history
saveScreenAnalysis(prompt, fullText, model);
return { success: true, text: fullText, model: model };
} catch (error) {
console.error('Error sending image to Gemini HTTP:', error);
return { success: false, error: error.message };
}
}
module.exports = {
initializeGeminiSession,
getEnabledTools,
getStoredSetting,
sendToRenderer,
initializeNewSession,
saveConversationTurn,
getCurrentSessionData,
killExistingSystemAudioDump,
startMacOSAudioCapture,
convertStereoToMono,
stopMacOSAudioCapture,
sendAudioToGemini,
sendImageToGeminiHttp,
formatSpeakerResults,
};


@ -0,0 +1,402 @@
const { BrowserWindow } = require('electron');
const WebSocket = require('ws');
// OpenAI Realtime API implementation
// Documentation: https://platform.openai.com/docs/api-reference/realtime
let ws = null;
let isUserClosing = false;
let sessionParams = null;
let reconnectAttempts = 0;
const MAX_RECONNECT_ATTEMPTS = 3;
const RECONNECT_DELAY = 2000;
// Message buffer for accumulating responses
let messageBuffer = '';
let currentTranscription = '';
function sendToRenderer(channel, data) {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
windows[0].webContents.send(channel, data);
}
}
function buildContextMessage(conversationHistory) {
const lastTurns = conversationHistory.slice(-20);
const validTurns = lastTurns.filter(turn => turn.transcription?.trim() && turn.ai_response?.trim());
if (validTurns.length === 0) return null;
const contextLines = validTurns.map(turn => `User: ${turn.transcription.trim()}\nAssistant: ${turn.ai_response.trim()}`);
return `Session reconnected. Here's the conversation so far:\n\n${contextLines.join('\n\n')}\n\nContinue from here.`;
}
async function initializeOpenAISession(config, conversationHistory = []) {
const { apiKey, baseUrl, systemPrompt, model, language, isReconnect } = config;
if (!isReconnect) {
sessionParams = config;
reconnectAttempts = 0;
sendToRenderer('session-initializing', true);
}
// Use custom baseURL or default OpenAI endpoint
const wsUrl = baseUrl || 'wss://api.openai.com/v1/realtime';
const fullUrl = `${wsUrl}?model=${model || 'gpt-4o-realtime-preview-2024-12-17'}`;
return new Promise((resolve, reject) => {
try {
ws = new WebSocket(fullUrl, {
headers: {
Authorization: `Bearer ${apiKey}`,
'OpenAI-Beta': 'realtime=v1',
},
});
ws.on('open', () => {
console.log('OpenAI Realtime connection established');
// Configure session
const sessionConfig = {
type: 'session.update',
session: {
modalities: ['text', 'audio'],
instructions: systemPrompt,
voice: 'alloy',
input_audio_format: 'pcm16',
output_audio_format: 'pcm16',
input_audio_transcription: {
model: 'whisper-1',
},
turn_detection: {
type: 'server_vad',
threshold: 0.5,
prefix_padding_ms: 300,
silence_duration_ms: 500,
},
temperature: 0.8,
max_response_output_tokens: 4096,
},
};
ws.send(JSON.stringify(sessionConfig));
// Restore context if reconnecting
if (isReconnect && conversationHistory.length > 0) {
const contextMessage = buildContextMessage(conversationHistory);
if (contextMessage) {
ws.send(
JSON.stringify({
type: 'conversation.item.create',
item: {
type: 'message',
role: 'user',
content: [{ type: 'input_text', text: contextMessage }],
},
})
);
ws.send(JSON.stringify({ type: 'response.create' }));
}
}
sendToRenderer('update-status', 'Connected to OpenAI');
if (!isReconnect) {
sendToRenderer('session-initializing', false);
}
resolve(ws);
});
ws.on('message', data => {
try {
const event = JSON.parse(data.toString());
handleOpenAIEvent(event);
} catch (error) {
console.error('Error parsing OpenAI message:', error);
}
});
ws.on('error', error => {
console.error('OpenAI WebSocket error:', error);
sendToRenderer('update-status', 'Error: ' + error.message);
reject(error);
});
ws.on('close', (code, reason) => {
console.log(`OpenAI WebSocket closed: ${code} - ${reason}`);
if (isUserClosing) {
isUserClosing = false;
sendToRenderer('update-status', 'Session closed');
return;
}
// Attempt reconnection
if (sessionParams && reconnectAttempts < MAX_RECONNECT_ATTEMPTS) {
attemptReconnect(conversationHistory);
} else {
sendToRenderer('update-status', 'Session closed');
}
});
} catch (error) {
console.error('Failed to initialize OpenAI session:', error);
if (!isReconnect) {
sendToRenderer('session-initializing', false);
}
reject(error);
}
});
}
function handleOpenAIEvent(event) {
console.log('OpenAI event:', event.type);
switch (event.type) {
case 'session.created':
console.log('Session created:', event.session.id);
break;
case 'session.updated':
console.log('Session updated');
sendToRenderer('update-status', 'Listening...');
break;
case 'input_audio_buffer.speech_started':
console.log('Speech started');
break;
case 'input_audio_buffer.speech_stopped':
console.log('Speech stopped');
break;
case 'conversation.item.input_audio_transcription.completed':
if (event.transcript) {
currentTranscription += event.transcript;
console.log('Transcription:', event.transcript);
}
break;
case 'response.audio_transcript.delta':
if (event.delta) {
const isNewResponse = messageBuffer === '';
messageBuffer += event.delta;
sendToRenderer(isNewResponse ? 'new-response' : 'update-response', messageBuffer);
}
break;
case 'response.audio_transcript.done':
console.log('Audio transcript complete');
break;
case 'response.text.delta':
if (event.delta) {
const isNewResponse = messageBuffer === '';
messageBuffer += event.delta;
sendToRenderer(isNewResponse ? 'new-response' : 'update-response', messageBuffer);
}
break;
case 'response.done':
if (messageBuffer.trim() !== '') {
sendToRenderer('update-response', messageBuffer);
// Send conversation turn to be saved
if (currentTranscription) {
sendToRenderer('save-conversation-turn-data', {
transcription: currentTranscription,
response: messageBuffer,
});
currentTranscription = '';
}
}
messageBuffer = '';
sendToRenderer('update-status', 'Listening...');
break;
case 'error':
console.error('OpenAI error:', event.error);
sendToRenderer('update-status', 'Error: ' + event.error.message);
break;
default:
// console.log('Unhandled event type:', event.type);
break;
}
}
async function attemptReconnect(conversationHistory) {
reconnectAttempts++;
console.log(`Reconnection attempt ${reconnectAttempts}/${MAX_RECONNECT_ATTEMPTS}`);
messageBuffer = '';
currentTranscription = '';
sendToRenderer('update-status', `Reconnecting... (${reconnectAttempts}/${MAX_RECONNECT_ATTEMPTS})`);
await new Promise(resolve => setTimeout(resolve, RECONNECT_DELAY));
try {
const newConfig = { ...sessionParams, isReconnect: true };
ws = await initializeOpenAISession(newConfig, conversationHistory);
sendToRenderer('update-status', 'Reconnected! Listening...');
console.log('OpenAI session reconnected successfully');
return true;
} catch (error) {
console.error(`Reconnection attempt ${reconnectAttempts} failed:`, error);
if (reconnectAttempts < MAX_RECONNECT_ATTEMPTS) {
return attemptReconnect(conversationHistory);
}
console.log('Max reconnection attempts reached');
sendToRenderer('reconnect-failed', {
            message: 'Failed to reconnect to OpenAI after 3 attempts. Check your network connection and API key.',
});
sessionParams = null;
return false;
}
}
async function sendAudioToOpenAI(base64Data) {
if (!ws || ws.readyState !== WebSocket.OPEN) {
console.error('WebSocket not connected');
return { success: false, error: 'No active connection' };
}
try {
ws.send(
JSON.stringify({
type: 'input_audio_buffer.append',
audio: base64Data,
})
);
return { success: true };
} catch (error) {
console.error('Error sending audio to OpenAI:', error);
return { success: false, error: error.message };
}
}
async function sendTextToOpenAI(text) {
if (!ws || ws.readyState !== WebSocket.OPEN) {
console.error('WebSocket not connected');
return { success: false, error: 'No active connection' };
}
try {
// Create a conversation item with user text
ws.send(
JSON.stringify({
type: 'conversation.item.create',
item: {
type: 'message',
role: 'user',
content: [{ type: 'input_text', text: text }],
},
})
);
// Trigger response generation
ws.send(JSON.stringify({ type: 'response.create' }));
return { success: true };
} catch (error) {
console.error('Error sending text to OpenAI:', error);
return { success: false, error: error.message };
}
}
async function sendImageToOpenAI(base64Data, prompt, config) {
const { apiKey, baseUrl, model } = config;
// OpenAI doesn't support images in Realtime API yet, use standard Chat Completions
    const apiEndpoint = baseUrl
        ? `${baseUrl.replace('wss://', 'https://').replace('/v1/realtime', '')}/v1/chat/completions`
        : 'https://api.openai.com/v1/chat/completions';
try {
const response = await fetch(apiEndpoint, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${apiKey}`,
},
body: JSON.stringify({
model: model || 'gpt-4o',
messages: [
{
role: 'user',
content: [
{ type: 'text', text: prompt },
{
type: 'image_url',
image_url: {
url: `data:image/jpeg;base64,${base64Data}`,
},
},
],
},
],
max_tokens: 4096,
stream: true,
}),
});
if (!response.ok) {
const error = await response.text();
throw new Error(`OpenAI API error: ${response.status} - ${error}`);
}
const reader = response.body.getReader();
const decoder = new TextDecoder();
let fullText = '';
let isFirst = true;
while (true) {
const { done, value } = await reader.read();
if (done) break;
            const chunk = decoder.decode(value, { stream: true }); // stream: true keeps multi-byte chars split across chunks intact
const lines = chunk.split('\n').filter(line => line.trim().startsWith('data: '));
for (const line of lines) {
const data = line.replace('data: ', '');
if (data === '[DONE]') continue;
try {
const json = JSON.parse(data);
const content = json.choices[0]?.delta?.content;
if (content) {
fullText += content;
sendToRenderer(isFirst ? 'new-response' : 'update-response', fullText);
isFirst = false;
}
} catch (e) {
// Skip invalid JSON
}
}
}
return { success: true, text: fullText, model: model || 'gpt-4o' };
} catch (error) {
console.error('Error sending image to OpenAI:', error);
return { success: false, error: error.message };
}
}
function closeOpenAISession() {
isUserClosing = true;
sessionParams = null;
if (ws) {
ws.close();
ws = null;
}
}
module.exports = {
initializeOpenAISession,
sendAudioToOpenAI,
sendTextToOpenAI,
sendImageToOpenAI,
closeOpenAISession,
};

src/utils/openai-sdk.js Normal file

@ -0,0 +1,561 @@
const { BrowserWindow } = require('electron');
const fs = require('fs');
const path = require('path');
const os = require('os');
const { spawn } = require('child_process');
// OpenAI SDK will be loaded dynamically
let OpenAI = null;
// OpenAI SDK-based provider (for BotHub, Azure, and other OpenAI-compatible APIs)
// This uses the standard Chat Completions API with Whisper for transcription
let openaiClient = null;
let currentConfig = null;
let conversationMessages = [];
let isProcessing = false;
// macOS audio capture
let systemAudioProc = null;
let audioBuffer = Buffer.alloc(0);
let transcriptionTimer = null;
const TRANSCRIPTION_INTERVAL_MS = 3000; // Transcribe every 3 seconds
const MIN_AUDIO_DURATION_MS = 500; // Minimum audio duration to transcribe
const SAMPLE_RATE = 24000;
function sendToRenderer(channel, data) {
const windows = BrowserWindow.getAllWindows();
if (windows.length > 0) {
windows[0].webContents.send(channel, data);
}
}
async function initializeOpenAISDK(config) {
const { apiKey, baseUrl, model } = config;
if (!apiKey) {
throw new Error('OpenAI API key is required');
}
// Dynamic import for ES module
if (!OpenAI) {
const openaiModule = await import('openai');
OpenAI = openaiModule.default;
}
const clientConfig = {
apiKey: apiKey,
};
// Use custom baseURL if provided
if (baseUrl && baseUrl.trim() !== '') {
clientConfig.baseURL = baseUrl;
}
openaiClient = new OpenAI(clientConfig);
currentConfig = config;
conversationMessages = [];
console.log('OpenAI SDK initialized with baseURL:', clientConfig.baseURL || 'default');
sendToRenderer('update-status', 'Ready (OpenAI SDK)');
return true;
}
function setSystemPrompt(systemPrompt) {
// Clear conversation and set system prompt
conversationMessages = [];
if (systemPrompt) {
conversationMessages.push({
role: 'system',
content: systemPrompt,
});
}
}
// Create WAV file from raw PCM data
function createWavBuffer(pcmBuffer, sampleRate = 24000, numChannels = 1, bitsPerSample = 16) {
const byteRate = sampleRate * numChannels * (bitsPerSample / 8);
const blockAlign = numChannels * (bitsPerSample / 8);
const dataSize = pcmBuffer.length;
const headerSize = 44;
    const fileSize = headerSize + dataSize - 8; // RIFF chunk size excludes the 8-byte "RIFF" + size preamble
const wavBuffer = Buffer.alloc(headerSize + dataSize);
// RIFF header
wavBuffer.write('RIFF', 0);
wavBuffer.writeUInt32LE(fileSize, 4);
wavBuffer.write('WAVE', 8);
// fmt chunk
wavBuffer.write('fmt ', 12);
wavBuffer.writeUInt32LE(16, 16); // fmt chunk size
wavBuffer.writeUInt16LE(1, 20); // audio format (1 = PCM)
wavBuffer.writeUInt16LE(numChannels, 22);
wavBuffer.writeUInt32LE(sampleRate, 24);
wavBuffer.writeUInt32LE(byteRate, 28);
wavBuffer.writeUInt16LE(blockAlign, 32);
wavBuffer.writeUInt16LE(bitsPerSample, 34);
// data chunk
wavBuffer.write('data', 36);
wavBuffer.writeUInt32LE(dataSize, 40);
// Copy PCM data
pcmBuffer.copy(wavBuffer, 44);
return wavBuffer;
}
async function transcribeAudio(audioBuffer, mimeType = 'audio/wav') {
if (!openaiClient) {
throw new Error('OpenAI client not initialized');
}
try {
// Save audio buffer to temp file (OpenAI SDK requires file path)
const tempDir = os.tmpdir();
const tempFile = path.join(tempDir, `audio_${Date.now()}.wav`);
// Convert base64 to buffer if needed
let buffer = audioBuffer;
if (typeof audioBuffer === 'string') {
buffer = Buffer.from(audioBuffer, 'base64');
}
// Create proper WAV file with header
const wavBuffer = createWavBuffer(buffer, SAMPLE_RATE, 1, 16);
fs.writeFileSync(tempFile, wavBuffer);
const transcription = await openaiClient.audio.transcriptions.create({
file: fs.createReadStream(tempFile),
model: currentConfig.whisperModel || 'whisper-1',
response_format: 'text',
});
// Clean up temp file
try {
fs.unlinkSync(tempFile);
} catch (e) {
// Ignore cleanup errors
}
return transcription;
} catch (error) {
console.error('Transcription error:', error);
throw error;
}
}
async function sendTextMessage(text) {
if (!openaiClient) {
return { success: false, error: 'OpenAI client not initialized' };
}
if (isProcessing) {
return { success: false, error: 'Already processing a request' };
}
isProcessing = true;
try {
// Add user message to conversation
conversationMessages.push({
role: 'user',
content: text,
});
sendToRenderer('update-status', 'Thinking...');
const stream = await openaiClient.chat.completions.create({
model: currentConfig.model || 'gpt-4o',
messages: conversationMessages,
stream: true,
max_tokens: 4096,
});
let fullResponse = '';
let isFirst = true;
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content;
if (content) {
fullResponse += content;
sendToRenderer(isFirst ? 'new-response' : 'update-response', fullResponse);
isFirst = false;
}
}
// Add assistant response to conversation
conversationMessages.push({
role: 'assistant',
content: fullResponse,
});
sendToRenderer('update-status', 'Ready');
isProcessing = false;
return { success: true, text: fullResponse };
} catch (error) {
console.error('Chat completion error:', error);
sendToRenderer('update-status', 'Error: ' + error.message);
isProcessing = false;
return { success: false, error: error.message };
}
}
async function sendImageMessage(base64Image, prompt) {
if (!openaiClient) {
return { success: false, error: 'OpenAI client not initialized' };
}
if (isProcessing) {
return { success: false, error: 'Already processing a request' };
}
isProcessing = true;
try {
sendToRenderer('update-status', 'Analyzing image...');
const messages = [
...conversationMessages,
{
role: 'user',
content: [
{ type: 'text', text: prompt },
{
type: 'image_url',
image_url: {
url: `data:image/jpeg;base64,${base64Image}`,
},
},
],
},
];
const stream = await openaiClient.chat.completions.create({
model: currentConfig.visionModel || currentConfig.model || 'gpt-4o',
messages: messages,
stream: true,
max_tokens: 4096,
});
let fullResponse = '';
let isFirst = true;
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content;
if (content) {
fullResponse += content;
sendToRenderer(isFirst ? 'new-response' : 'update-response', fullResponse);
isFirst = false;
}
}
// Add to conversation history (text only for follow-ups)
conversationMessages.push({
role: 'user',
content: prompt,
});
conversationMessages.push({
role: 'assistant',
content: fullResponse,
});
sendToRenderer('update-status', 'Ready');
isProcessing = false;
return { success: true, text: fullResponse, model: currentConfig.visionModel || currentConfig.model };
} catch (error) {
console.error('Vision error:', error);
sendToRenderer('update-status', 'Error: ' + error.message);
isProcessing = false;
return { success: false, error: error.message };
}
}
// Process audio chunk and get response
// This accumulates audio and transcribes when silence is detected
let audioChunks = [];
let lastAudioTime = 0;
const SILENCE_THRESHOLD_MS = 1500; // 1.5 seconds of silence
async function processAudioChunk(base64Audio, mimeType) {
if (!openaiClient) {
return { success: false, error: 'OpenAI client not initialized' };
}
const now = Date.now();
const buffer = Buffer.from(base64Audio, 'base64');
// Add to audio buffer
audioChunks.push(buffer);
lastAudioTime = now;
    // Flushing and silence handling happen elsewhere (flushAudioAndTranscribe
    // and the periodic transcription timer); here we only accumulate the chunk.
    return { success: true, buffering: true };
}
async function flushAudioAndTranscribe() {
if (audioChunks.length === 0) {
return { success: true, text: '' };
}
try {
// Combine all audio chunks
const combinedBuffer = Buffer.concat(audioChunks);
audioChunks = [];
// Transcribe
const transcription = await transcribeAudio(combinedBuffer);
if (transcription && transcription.trim()) {
// Send to chat
const response = await sendTextMessage(transcription);
return {
success: true,
transcription: transcription,
response: response.text,
};
}
return { success: true, text: '' };
} catch (error) {
console.error('Flush audio error:', error);
return { success: false, error: error.message };
}
}
function clearConversation() {
const systemMessage = conversationMessages.find(m => m.role === 'system');
conversationMessages = systemMessage ? [systemMessage] : [];
audioChunks = [];
}
function closeOpenAISDK() {
stopMacOSAudioCapture();
openaiClient = null;
currentConfig = null;
conversationMessages = [];
audioChunks = [];
isProcessing = false;
sendToRenderer('update-status', 'Disconnected');
}
// ============ macOS Audio Capture ============
async function killExistingSystemAudioDump() {
return new Promise(resolve => {
const { exec } = require('child_process');
exec('pkill -f SystemAudioDump', error => {
// Ignore errors (process might not exist)
setTimeout(resolve, 100);
});
});
}
// Downmix interleaved 16-bit stereo PCM by keeping only the left channel
function convertStereoToMono(stereoBuffer) {
    const samples = stereoBuffer.length / 4; // 4 bytes per stereo frame
const monoBuffer = Buffer.alloc(samples * 2);
for (let i = 0; i < samples; i++) {
const leftSample = stereoBuffer.readInt16LE(i * 4);
monoBuffer.writeInt16LE(leftSample, i * 2);
}
return monoBuffer;
}
// Calculate RMS (Root Mean Square) volume level of audio buffer
function calculateRMS(buffer) {
const samples = buffer.length / 2;
if (samples === 0) return 0;
let sumSquares = 0;
for (let i = 0; i < samples; i++) {
const sample = buffer.readInt16LE(i * 2);
sumSquares += sample * sample;
}
return Math.sqrt(sumSquares / samples);
}
// Check if audio contains speech (simple VAD based on volume threshold)
function hasSpeech(buffer, threshold = 500) {
const rms = calculateRMS(buffer);
return rms > threshold;
}
async function transcribeBufferedAudio() {
if (audioBuffer.length === 0 || isProcessing) {
return;
}
// Calculate audio duration
const bytesPerSample = 2;
const audioDurationMs = (audioBuffer.length / bytesPerSample / SAMPLE_RATE) * 1000;
if (audioDurationMs < MIN_AUDIO_DURATION_MS) {
return; // Not enough audio
}
// Check if there's actual speech in the audio (Voice Activity Detection)
if (!hasSpeech(audioBuffer)) {
// Clear buffer if it's just silence/noise
audioBuffer = Buffer.alloc(0);
return;
}
// Take current buffer and reset
const currentBuffer = audioBuffer;
audioBuffer = Buffer.alloc(0);
try {
console.log(`Transcribing ${audioDurationMs.toFixed(0)}ms of audio...`);
sendToRenderer('update-status', 'Transcribing...');
const transcription = await transcribeAudio(currentBuffer, 'audio/wav');
if (transcription && transcription.trim() && transcription.trim().length > 2) {
console.log('Transcription:', transcription);
sendToRenderer('update-status', 'Processing...');
// Send to chat
await sendTextMessage(transcription);
}
sendToRenderer('update-status', 'Listening...');
} catch (error) {
console.error('Transcription error:', error);
sendToRenderer('update-status', 'Listening...');
}
}
async function startMacOSAudioCapture() {
if (process.platform !== 'darwin') return false;
// Kill any existing SystemAudioDump processes first
await killExistingSystemAudioDump();
console.log('Starting macOS audio capture with SystemAudioDump for OpenAI SDK...');
const { app } = require('electron');
let systemAudioPath;
if (app.isPackaged) {
systemAudioPath = path.join(process.resourcesPath, 'SystemAudioDump');
} else {
systemAudioPath = path.join(__dirname, '../assets', 'SystemAudioDump');
}
console.log('SystemAudioDump path:', systemAudioPath);
const spawnOptions = {
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
},
};
systemAudioProc = spawn(systemAudioPath, [], spawnOptions);
if (!systemAudioProc.pid) {
console.error('Failed to start SystemAudioDump');
return false;
}
console.log('SystemAudioDump started with PID:', systemAudioProc.pid);
const CHUNK_DURATION = 0.1;
const BYTES_PER_SAMPLE = 2;
const CHANNELS = 2;
const CHUNK_SIZE = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * CHUNK_DURATION;
let tempBuffer = Buffer.alloc(0);
systemAudioProc.stdout.on('data', data => {
tempBuffer = Buffer.concat([tempBuffer, data]);
while (tempBuffer.length >= CHUNK_SIZE) {
const chunk = tempBuffer.slice(0, CHUNK_SIZE);
tempBuffer = tempBuffer.slice(CHUNK_SIZE);
// Convert stereo to mono
const monoChunk = CHANNELS === 2 ? convertStereoToMono(chunk) : chunk;
// Add to audio buffer for transcription
audioBuffer = Buffer.concat([audioBuffer, monoChunk]);
}
// Limit buffer size (max 30 seconds of audio)
const maxBufferSize = SAMPLE_RATE * BYTES_PER_SAMPLE * 30;
if (audioBuffer.length > maxBufferSize) {
audioBuffer = audioBuffer.slice(-maxBufferSize);
}
});
systemAudioProc.stderr.on('data', data => {
console.error('SystemAudioDump stderr:', data.toString());
});
systemAudioProc.on('close', code => {
console.log('SystemAudioDump process closed with code:', code);
systemAudioProc = null;
stopTranscriptionTimer();
});
systemAudioProc.on('error', err => {
console.error('SystemAudioDump process error:', err);
systemAudioProc = null;
stopTranscriptionTimer();
});
// Start periodic transcription
startTranscriptionTimer();
sendToRenderer('update-status', 'Listening...');
return true;
}
function startTranscriptionTimer() {
stopTranscriptionTimer();
transcriptionTimer = setInterval(transcribeBufferedAudio, TRANSCRIPTION_INTERVAL_MS);
}
function stopTranscriptionTimer() {
if (transcriptionTimer) {
clearInterval(transcriptionTimer);
transcriptionTimer = null;
}
}
function stopMacOSAudioCapture() {
stopTranscriptionTimer();
if (systemAudioProc) {
console.log('Stopping SystemAudioDump for OpenAI SDK...');
systemAudioProc.kill('SIGTERM');
systemAudioProc = null;
}
audioBuffer = Buffer.alloc(0);
}
module.exports = {
initializeOpenAISDK,
setSystemPrompt,
transcribeAudio,
sendTextMessage,
sendImageMessage,
processAudioChunk,
flushAudioAndTranscribe,
clearConversation,
closeOpenAISDK,
startMacOSAudioCapture,
stopMacOSAudioCapture,
};

src/utils/prompts.js Normal file

@ -0,0 +1,225 @@
const profilePrompts = {
interview: {
intro: `You are an AI-powered interview assistant, designed to act as a discreet on-screen teleprompter. Your mission is to help the user excel in their job interview by providing concise, impactful, and ready-to-speak answers or key talking points. Analyze the ongoing interview dialogue and, crucially, the 'User-provided context' below.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-3 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for key points and emphasis
- Use bullet points (-) for lists when appropriate
- Focus on the most essential information only`,
searchUsage: `**SEARCH TOOL USAGE:**
- If the interviewer mentions **recent events, news, or current trends** (anything from the last 6 months), **ALWAYS use Google search** to get up-to-date information
- If they ask about **company-specific information, recent acquisitions, funding, or leadership changes**, use Google search first
- If they mention **new technologies, frameworks, or industry developments**, search for the latest information
- After searching, provide a **concise, informed response** based on the real-time data`,
content: `Focus on delivering the most essential information the user needs. Your suggestions should be direct and immediately usable.
To help the user 'crack' the interview in their specific field:
1. Heavily rely on the 'User-provided context' (e.g., details about their industry, the job description, their resume, key skills, and achievements).
2. Tailor your responses to be highly relevant to their field and the specific role they are interviewing for.
Examples (these illustrate the desired direct, ready-to-speak style; your generated content should be tailored using the user's context):
Interviewer: "Tell me about yourself"
You: "I'm a software engineer with 5 years of experience building scalable web applications. I specialize in React and Node.js, and I've led development teams at two different startups. I'm passionate about clean code and solving complex technical challenges."
Interviewer: "What's your experience with React?"
You: "I've been working with React for 4 years, building everything from simple landing pages to complex dashboards with thousands of users. I'm experienced with React hooks, context API, and performance optimization. I've also worked with Next.js for server-side rendering and have built custom component libraries."
Interviewer: "Why do you want to work here?"
You: "I'm excited about this role because your company is solving real problems in the fintech space, which aligns with my interest in building products that impact people's daily lives. I've researched your tech stack and I'm particularly interested in contributing to your microservices architecture. Your focus on innovation and the opportunity to work with a talented team really appeals to me."`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide only the exact words to say in **markdown format**. No coaching, no "you should" statements, no explanations - just the direct response the candidate can speak immediately. Keep it **short and impactful**.`,
},
sales: {
intro: `You are a sales call assistant. Your job is to provide the exact words the salesperson should say to prospects during sales calls. Give direct, ready-to-speak responses that are persuasive and professional.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-3 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for key points and emphasis
- Use bullet points (-) for lists when appropriate
- Focus on the most essential information only`,
searchUsage: `**SEARCH TOOL USAGE:**
- If the prospect mentions **recent industry trends, market changes, or current events**, **ALWAYS use Google search** to get up-to-date information
- If they reference **competitor information, recent funding news, or market data**, search for the latest information first
- If they ask about **new regulations, industry reports, or recent developments**, use search to provide accurate data
- After searching, provide a **concise, informed response** that demonstrates current market knowledge`,
content: `Examples:
Prospect: "Tell me about your product"
You: "Our platform helps companies like yours reduce operational costs by 30% while improving efficiency. We've worked with over 500 businesses in your industry, and they typically see ROI within the first 90 days. What specific operational challenges are you facing right now?"
Prospect: "What makes you different from competitors?"
You: "Three key differentiators set us apart: First, our implementation takes just 2 weeks versus the industry average of 2 months. Second, we provide dedicated support with response times under 4 hours. Third, our pricing scales with your usage, so you only pay for what you need. Which of these resonates most with your current situation?"
Prospect: "I need to think about it"
You: "I completely understand this is an important decision. What specific concerns can I address for you today? Is it about implementation timeline, cost, or integration with your existing systems? I'd rather help you make an informed decision now than leave you with unanswered questions."`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide only the exact words to say in **markdown format**. Be persuasive but not pushy. Focus on value and addressing objections directly. Keep responses **short and impactful**.`,
},
meeting: {
intro: `You are a meeting assistant. Your job is to provide the exact words to say during professional meetings, presentations, and discussions. Give direct, ready-to-speak responses that are clear and professional.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-3 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for key points and emphasis
- Use bullet points (-) for lists when appropriate
- Focus on the most essential information only`,
searchUsage: `**SEARCH TOOL USAGE:**
- If participants mention **recent industry news, regulatory changes, or market updates**, **ALWAYS use Google search** for current information
- If they reference **competitor activities, recent reports, or current statistics**, search for the latest data first
- If they discuss **new technologies, tools, or industry developments**, use search to provide accurate insights
- After searching, provide a **concise, informed response** that adds value to the discussion`,
content: `Examples:
Participant: "What's the status on the project?"
You: "We're currently on track to meet our deadline. We've completed 75% of the deliverables, with the remaining items scheduled for completion by Friday. The main challenge we're facing is the integration testing, but we have a plan in place to address it."
Participant: "Can you walk us through the budget?"
You: "Absolutely. We're currently at 80% of our allocated budget with 20% of the timeline remaining. The largest expense has been development resources at $50K, followed by infrastructure costs at $15K. We have contingency funds available if needed for the final phase."
Participant: "What are the next steps?"
You: "Moving forward, I'll need approval on the revised timeline by end of day today. Sarah will handle the client communication, and Mike will coordinate with the technical team. We'll have our next checkpoint on Thursday to ensure everything stays on track."`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide only the exact words to say in **markdown format**. Be clear, concise, and action-oriented in your responses. Keep it **short and impactful**.`,
},
presentation: {
intro: `You are a presentation coach. Your job is to provide the exact words the presenter should say during presentations, pitches, and public speaking events. Give direct, ready-to-speak responses that are engaging and confident.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-3 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for key points and emphasis
- Use bullet points (-) for lists when appropriate
- Focus on the most essential information only`,
searchUsage: `**SEARCH TOOL USAGE:**
- If the audience asks about **recent market trends, current statistics, or latest industry data**, **ALWAYS use Google search** for up-to-date information
- If they reference **recent events, new competitors, or current market conditions**, search for the latest information first
- If they inquire about **recent studies, reports, or breaking news** in your field, use search to provide accurate data
- After searching, provide a **concise, credible response** with current facts and figures`,
content: `Examples:
Audience: "Can you explain that slide again?"
You: "Of course. This slide shows our three-year growth trajectory. The blue line represents revenue, which has grown 150% year over year. The orange bars show our customer acquisition, doubling each year. The key insight here is that our customer lifetime value has increased by 40% while acquisition costs have remained flat."
Audience: "What's your competitive advantage?"
You: "Great question. Our competitive advantage comes down to three core strengths: speed, reliability, and cost-effectiveness. We deliver results 3x faster than traditional solutions, with 99.9% uptime, at 50% lower cost. This combination is what has allowed us to capture 25% market share in just two years."
Audience: "How do you plan to scale?"
You: "Our scaling strategy focuses on three pillars. First, we're expanding our engineering team by 200% to accelerate product development. Second, we're entering three new markets next quarter. Third, we're building strategic partnerships that will give us access to 10 million additional potential customers."`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide only the exact words to say in **markdown format**. Be confident, engaging, and back up claims with specific numbers or facts when possible. Keep responses **short and impactful**.`,
},
negotiation: {
intro: `You are a negotiation assistant. Your job is to provide the exact words to say during business negotiations, contract discussions, and deal-making conversations. Give direct, ready-to-speak responses that are strategic and professional.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-3 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for key points and emphasis
- Use bullet points (-) for lists when appropriate
- Focus on the most essential information only`,
searchUsage: `**SEARCH TOOL USAGE:**
- If they mention **recent market pricing, current industry standards, or competitor offers**, **ALWAYS use Google search** for current benchmarks
- If they reference **recent legal changes, new regulations, or market conditions**, search for the latest information first
- If they discuss **recent company news, financial performance, or industry developments**, use search to provide informed responses
- After searching, provide a **strategic, well-informed response** that leverages current market intelligence`,
content: `Examples:
Other party: "That price is too high"
You: "I understand your concern about the investment. Let's look at the value you're getting: this solution will save you $200K annually in operational costs, which means you'll break even in just 6 months. Would it help if we structured the payment terms differently, perhaps spreading it over 12 months instead of upfront?"
Other party: "We need a better deal"
You: "I appreciate your directness. We want this to work for both parties. Our current offer is already at a 15% discount from our standard pricing. If budget is the main concern, we could consider reducing the scope initially and adding features as you see results. What specific budget range were you hoping to achieve?"
Other party: "We're considering other options"
You: "That's smart business practice. While you're evaluating alternatives, I want to ensure you have all the information. Our solution offers three unique benefits that others don't: 24/7 dedicated support, guaranteed 48-hour implementation, and a money-back guarantee if you don't see results in 90 days. How important are these factors in your decision?"`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide only the exact words to say in **markdown format**. Focus on finding win-win solutions and addressing underlying concerns. Keep responses **short and impactful**.`,
},
exam: {
intro: `You are an exam assistant designed to help students pass tests efficiently. Your role is to provide direct, accurate answers to exam questions with minimal explanation - just enough to confirm the answer is correct.`,
formatRequirements: `**RESPONSE FORMAT REQUIREMENTS:**
- Keep responses SHORT and CONCISE (1-2 sentences max)
- Use **markdown formatting** for better readability
- Use **bold** for the answer choice/result
- Focus on the most essential information only
- Provide only brief justification for correctness`,
searchUsage: `**SEARCH TOOL USAGE:**
- If the question involves **recent information, current events, or updated facts**, **ALWAYS use Google search** for the latest data
- If they reference **specific dates, statistics, or factual information** that might be outdated, search for current information
- If they ask about **recent research, new theories, or updated methodologies**, search for the latest information
- After searching, provide **direct, accurate answers** with minimal explanation`,
content: `Focus on providing efficient exam assistance that helps students pass tests quickly.
**Key Principles:**
1. **Answer the question directly** - no unnecessary explanations
2. **Include the question text** to verify you've read it properly
3. **Provide the correct answer choice** clearly marked
4. **Give brief justification** for why it's correct
5. **Be concise and to the point** - efficiency is key
Examples (these illustrate the desired direct, efficient style):
Question: "What is the capital of France?"
You: "**Question**: What is the capital of France? **Answer**: Paris. **Why**: Paris has been the capital of France since 987 CE and is the country's largest city and political center."
Question: "Which of the following is a primary color? A) Green B) Red C) Purple D) Orange"
You: "**Question**: Which of the following is a primary color? A) Green B) Red C) Purple D) Orange **Answer**: B) Red **Why**: Red is one of the three primary colors (red, blue, yellow) that cannot be created by mixing other colors."
Question: "Solve for x: 2x + 5 = 13"
You: "**Question**: Solve for x: 2x + 5 = 13 **Answer**: x = 4 **Why**: Subtract 5 from both sides: 2x = 8, then divide by 2: x = 4."`,
outputInstructions: `**OUTPUT INSTRUCTIONS:**
Provide direct exam answers in **markdown format**. Include the question text, the correct answer choice, and a brief justification. Focus on efficiency and accuracy. Keep responses **short and to the point**.`,
},
};
function buildSystemPrompt(promptParts, customPrompt = '', googleSearchEnabled = true) {
const sections = [promptParts.intro, '\n\n', promptParts.formatRequirements];
// Only add search usage section if Google Search is enabled
if (googleSearchEnabled) {
sections.push('\n\n', promptParts.searchUsage);
}
sections.push('\n\n', promptParts.content, '\n\nUser-provided context\n-----\n', customPrompt, '\n-----\n\n', promptParts.outputInstructions);
return sections.join('');
}
function getSystemPrompt(profile, customPrompt = '', googleSearchEnabled = true) {
const promptParts = profilePrompts[profile] || profilePrompts.interview;
return buildSystemPrompt(promptParts, customPrompt, googleSearchEnabled);
}
module.exports = {
profilePrompts,
getSystemPrompt,
};
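The prompt-assembly logic above is easiest to see in isolation. This is a minimal standalone sketch that mirrors `buildSystemPrompt` with stub prompt parts (the part strings here are hypothetical placeholders, not the real profile text), showing how the search section is conditionally included and where the user context is spliced in:

```javascript
// Minimal sketch mirroring buildSystemPrompt above, with stub prompt parts.
const parts = {
    intro: 'You are an assistant.',
    formatRequirements: 'Keep it short.',
    searchUsage: 'Use search when needed.',
    content: 'Examples...',
    outputInstructions: 'Markdown only.',
};

function build(promptParts, customPrompt = '', googleSearchEnabled = true) {
    const sections = [promptParts.intro, '\n\n', promptParts.formatRequirements];
    // The search section is only present when Google Search is enabled
    if (googleSearchEnabled) sections.push('\n\n', promptParts.searchUsage);
    sections.push('\n\n', promptParts.content, '\n\nUser-provided context\n-----\n', customPrompt, '\n-----\n\n', promptParts.outputInstructions);
    return sections.join('');
}

const withSearch = build(parts, 'Senior React role');
const withoutSearch = build(parts, 'Senior React role', false);
// withoutSearch omits the searchUsage section entirely
```

Note that the user context is wrapped in `-----` delimiters, which gives the model an unambiguous boundary between instructions and untrusted user-provided text.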

src/utils/renderer.js Normal file

@ -0,0 +1,977 @@
// renderer.js
const { ipcRenderer } = require('electron');
let mediaStream = null;
let screenshotInterval = null;
let audioContext = null;
let audioProcessor = null;
let micAudioProcessor = null;
let audioBuffer = [];
const SAMPLE_RATE = 24000;
const AUDIO_CHUNK_DURATION = 0.1; // seconds
const BUFFER_SIZE = 4096; // Increased buffer size for smoother audio
let hiddenVideo = null;
let offscreenCanvas = null;
let offscreenContext = null;
let currentImageQuality = 'medium'; // Store current image quality for manual screenshots
const isLinux = process.platform === 'linux';
const isMacOS = process.platform === 'darwin';
// ============ STORAGE API ============
// Wrapper for IPC-based storage access
const storage = {
// Config
async getConfig() {
const result = await ipcRenderer.invoke('storage:get-config');
return result.success ? result.data : {};
},
async setConfig(config) {
return ipcRenderer.invoke('storage:set-config', config);
},
async updateConfig(key, value) {
return ipcRenderer.invoke('storage:update-config', key, value);
},
// Credentials
async getCredentials() {
const result = await ipcRenderer.invoke('storage:get-credentials');
return result.success ? result.data : {};
},
async setCredentials(credentials) {
return ipcRenderer.invoke('storage:set-credentials', credentials);
},
async getApiKey() {
const result = await ipcRenderer.invoke('storage:get-api-key');
return result.success ? result.data : '';
},
async setApiKey(apiKey) {
return ipcRenderer.invoke('storage:set-api-key', apiKey);
},
async getOpenAICredentials() {
const result = await ipcRenderer.invoke('storage:get-openai-credentials');
return result.success ? result.data : {};
},
async setOpenAICredentials(config) {
return ipcRenderer.invoke('storage:set-openai-credentials', config);
},
async getOpenAISDKCredentials() {
const result = await ipcRenderer.invoke('storage:get-openai-sdk-credentials');
return result.success ? result.data : {};
},
async setOpenAISDKCredentials(config) {
return ipcRenderer.invoke('storage:set-openai-sdk-credentials', config);
},
// Preferences
async getPreferences() {
const result = await ipcRenderer.invoke('storage:get-preferences');
return result.success ? result.data : {};
},
async setPreferences(preferences) {
return ipcRenderer.invoke('storage:set-preferences', preferences);
},
async updatePreference(key, value) {
return ipcRenderer.invoke('storage:update-preference', key, value);
},
// Keybinds
async getKeybinds() {
const result = await ipcRenderer.invoke('storage:get-keybinds');
return result.success ? result.data : null;
},
async setKeybinds(keybinds) {
return ipcRenderer.invoke('storage:set-keybinds', keybinds);
},
// Sessions (History)
async getAllSessions() {
const result = await ipcRenderer.invoke('storage:get-all-sessions');
return result.success ? result.data : [];
},
async getSession(sessionId) {
const result = await ipcRenderer.invoke('storage:get-session', sessionId);
return result.success ? result.data : null;
},
async saveSession(sessionId, data) {
return ipcRenderer.invoke('storage:save-session', sessionId, data);
},
async deleteSession(sessionId) {
return ipcRenderer.invoke('storage:delete-session', sessionId);
},
async deleteAllSessions() {
return ipcRenderer.invoke('storage:delete-all-sessions');
},
// Clear all
async clearAll() {
return ipcRenderer.invoke('storage:clear-all');
},
// Limits
async getTodayLimits() {
const result = await ipcRenderer.invoke('storage:get-today-limits');
return result.success ? result.data : { flash: { count: 0 }, flashLite: { count: 0 } };
}
};
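Every getter in the storage wrapper above unwraps the same `{ success, data }` envelope returned over IPC and falls back to a safe default on failure. A sketch of that pattern with a stubbed `invoke` standing in for `ipcRenderer.invoke` (the channel names and payloads here are illustrative):

```javascript
// Sketch of the { success, data } unwrapping used by the storage getters,
// with a stub standing in for ipcRenderer.invoke.
const fakeIpc = {
    async invoke(channel) {
        if (channel === 'storage:get-config') return { success: true, data: { theme: 'dark' } };
        return { success: false, error: 'unknown channel' };
    },
};

async function getConfig() {
    const result = await fakeIpc.invoke('storage:get-config');
    return result.success ? result.data : {}; // fall back to an empty object on failure
}

async function getMissing() {
    const result = await fakeIpc.invoke('storage:get-missing');
    return result.success ? result.data : {}; // errors never propagate to callers
}
```

The design choice here is that callers never see a rejected promise or an `error` field; failed reads degrade to an empty default, which keeps the renderer's hot paths free of try/catch.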
// Cache for preferences to avoid async calls in hot paths
let preferencesCache = null;
async function loadPreferencesCache() {
preferencesCache = await storage.getPreferences();
return preferencesCache;
}
// Initialize preferences cache (fire-and-forget; log failures rather than leaving an unhandled rejection)
loadPreferencesCache().catch(err => console.error('Failed to preload preferences:', err));
function convertFloat32ToInt16(float32Array) {
const int16Array = new Int16Array(float32Array.length);
for (let i = 0; i < float32Array.length; i++) {
// Clamp to [-1, 1], then scale asymmetrically so -1 maps to -0x8000 and +1 to 0x7fff without overflow
const s = Math.max(-1, Math.min(1, float32Array[i]));
int16Array[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
}
return int16Array;
}
function arrayBufferToBase64(buffer) {
let binary = '';
const bytes = new Uint8Array(buffer);
const len = bytes.byteLength;
for (let i = 0; i < len; i++) {
binary += String.fromCharCode(bytes[i]);
}
return btoa(binary);
}
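Together, the two helpers above form the PCM encoding path: Float32 samples from the Web Audio API are clamped to Int16 before being base64-encoded for IPC. A standalone sketch of the conversion step (`float32ToInt16` mirrors `convertFloat32ToInt16` above):

```javascript
// Sketch of the Float32 -> Int16 step of the PCM encoding path above.
function float32ToInt16(float32Array) {
    const int16Array = new Int16Array(float32Array.length);
    for (let i = 0; i < float32Array.length; i++) {
        const s = Math.max(-1, Math.min(1, float32Array[i])); // clamp to [-1, 1]
        // Asymmetric scale: -1 maps to -32768, +1 maps to 32767, so no overflow at either end
        int16Array[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
    }
    return int16Array;
}

const samples = Float32Array.from([-1, -0.5, 0, 0.5, 1]);
const pcm = float32ToInt16(samples);
// pcm is Int16Array [-32768, -16384, 0, 16383, 32767]
```

The asymmetric multiplier matters: scaling both signs by 0x8000 would turn a full-scale +1.0 sample into 32768, which overflows Int16.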
async function initializeGemini(profile = 'interview', language = 'en-US') {
const prefs = await storage.getPreferences();
const success = await ipcRenderer.invoke('initialize-ai-session', prefs.customPrompt || '', profile, language);
if (success) {
cheatingDaddy.setStatus('Live');
} else {
cheatingDaddy.setStatus('error');
}
}
// Listen for status updates
ipcRenderer.on('update-status', (event, status) => {
console.log('Status update:', status);
cheatingDaddy.setStatus(status);
});
async function startCapture(screenshotIntervalSeconds = 5, imageQuality = 'medium') {
// Store the image quality for manual screenshots
currentImageQuality = imageQuality;
// Refresh preferences cache
await loadPreferencesCache();
const audioMode = preferencesCache.audioMode || 'speaker_only';
try {
if (isMacOS) {
// On macOS, use SystemAudioDump for audio and getDisplayMedia for screen
console.log('Starting macOS capture with SystemAudioDump...');
// Start macOS audio capture
const audioResult = await ipcRenderer.invoke('start-macos-audio');
if (!audioResult.success) {
throw new Error('Failed to start macOS audio capture: ' + audioResult.error);
}
// Get screen capture for screenshots
mediaStream = await navigator.mediaDevices.getDisplayMedia({
video: {
frameRate: 1,
width: { ideal: 1920 },
height: { ideal: 1080 },
},
audio: false, // Don't use browser audio on macOS
});
console.log('macOS screen capture started - audio handled by SystemAudioDump');
if (audioMode === 'mic_only' || audioMode === 'both') {
let micStream = null;
try {
micStream = await navigator.mediaDevices.getUserMedia({
audio: {
sampleRate: SAMPLE_RATE,
channelCount: 1,
echoCancellation: true,
noiseSuppression: true,
autoGainControl: true,
},
video: false,
});
console.log('macOS microphone capture started');
setupLinuxMicProcessing(micStream);
} catch (micError) {
console.warn('Failed to get microphone access on macOS:', micError);
}
}
} else if (isLinux) {
// Linux - use display media for screen capture and try to get system audio
try {
// First try to get system audio via getDisplayMedia (works on newer browsers)
mediaStream = await navigator.mediaDevices.getDisplayMedia({
video: {
frameRate: 1,
width: { ideal: 1920 },
height: { ideal: 1080 },
},
audio: {
sampleRate: SAMPLE_RATE,
channelCount: 1,
echoCancellation: false, // Don't cancel system audio
noiseSuppression: false,
autoGainControl: false,
},
});
console.log('Linux system audio capture via getDisplayMedia succeeded');
// Setup audio processing for Linux system audio
setupLinuxSystemAudioProcessing();
} catch (systemAudioError) {
console.warn('System audio via getDisplayMedia failed, trying screen-only capture:', systemAudioError);
// Fallback to screen-only capture
mediaStream = await navigator.mediaDevices.getDisplayMedia({
video: {
frameRate: 1,
width: { ideal: 1920 },
height: { ideal: 1080 },
},
audio: false,
});
}
// Additionally get microphone input for Linux based on audio mode
if (audioMode === 'mic_only' || audioMode === 'both') {
let micStream = null;
try {
micStream = await navigator.mediaDevices.getUserMedia({
audio: {
sampleRate: SAMPLE_RATE,
channelCount: 1,
echoCancellation: true,
noiseSuppression: true,
autoGainControl: true,
},
video: false,
});
console.log('Linux microphone capture started');
// Setup audio processing for microphone on Linux
setupLinuxMicProcessing(micStream);
} catch (micError) {
console.warn('Failed to get microphone access on Linux:', micError);
// Continue without microphone if permission denied
}
}
console.log('Linux capture started - system audio:', mediaStream.getAudioTracks().length > 0, 'microphone mode:', audioMode);
} else {
// Windows - use display media with loopback for system audio
mediaStream = await navigator.mediaDevices.getDisplayMedia({
video: {
frameRate: 1,
width: { ideal: 1920 },
height: { ideal: 1080 },
},
audio: {
sampleRate: SAMPLE_RATE,
channelCount: 1,
echoCancellation: true,
noiseSuppression: true,
autoGainControl: true,
},
});
console.log('Windows capture started with loopback audio');
// Setup audio processing for Windows loopback audio only
setupWindowsLoopbackProcessing();
if (audioMode === 'mic_only' || audioMode === 'both') {
let micStream = null;
try {
micStream = await navigator.mediaDevices.getUserMedia({
audio: {
sampleRate: SAMPLE_RATE,
channelCount: 1,
echoCancellation: true,
noiseSuppression: true,
autoGainControl: true,
},
video: false,
});
console.log('Windows microphone capture started');
setupLinuxMicProcessing(micStream);
} catch (micError) {
console.warn('Failed to get microphone access on Windows:', micError);
}
}
}
console.log('MediaStream obtained:', {
hasVideo: mediaStream.getVideoTracks().length > 0,
hasAudio: mediaStream.getAudioTracks().length > 0,
videoTrack: mediaStream.getVideoTracks()[0]?.getSettings(),
});
// Manual mode only - screenshots captured on demand via shortcut
console.log('Manual mode enabled - screenshots will be captured on demand only');
} catch (err) {
console.error('Error starting capture:', err);
cheatingDaddy.setStatus('error');
}
}
function setupLinuxMicProcessing(micStream) {
// Setup microphone audio processing (despite the name, this is used on macOS, Linux, and Windows)
const micAudioContext = new AudioContext({ sampleRate: SAMPLE_RATE });
const micSource = micAudioContext.createMediaStreamSource(micStream);
const micProcessor = micAudioContext.createScriptProcessor(BUFFER_SIZE, 1, 1);
let audioBuffer = [];
const samplesPerChunk = SAMPLE_RATE * AUDIO_CHUNK_DURATION;
micProcessor.onaudioprocess = async e => {
const inputData = e.inputBuffer.getChannelData(0);
audioBuffer.push(...inputData);
// Process audio in chunks
while (audioBuffer.length >= samplesPerChunk) {
const chunk = audioBuffer.splice(0, samplesPerChunk);
const pcmData16 = convertFloat32ToInt16(chunk);
const base64Data = arrayBufferToBase64(pcmData16.buffer);
await ipcRenderer.invoke('send-mic-audio-content', {
data: base64Data,
mimeType: 'audio/pcm;rate=24000',
});
}
};
micSource.connect(micProcessor);
micProcessor.connect(micAudioContext.destination);
// Store processor reference for cleanup
micAudioProcessor = micProcessor;
}
function setupLinuxSystemAudioProcessing() {
// Setup system audio processing for Linux (from getDisplayMedia)
audioContext = new AudioContext({ sampleRate: SAMPLE_RATE });
const source = audioContext.createMediaStreamSource(mediaStream);
audioProcessor = audioContext.createScriptProcessor(BUFFER_SIZE, 1, 1);
let audioBuffer = [];
const samplesPerChunk = SAMPLE_RATE * AUDIO_CHUNK_DURATION;
audioProcessor.onaudioprocess = async e => {
const inputData = e.inputBuffer.getChannelData(0);
audioBuffer.push(...inputData);
// Process audio in chunks
while (audioBuffer.length >= samplesPerChunk) {
const chunk = audioBuffer.splice(0, samplesPerChunk);
const pcmData16 = convertFloat32ToInt16(chunk);
const base64Data = arrayBufferToBase64(pcmData16.buffer);
await ipcRenderer.invoke('send-audio-content', {
data: base64Data,
mimeType: 'audio/pcm;rate=24000',
});
}
};
source.connect(audioProcessor);
audioProcessor.connect(audioContext.destination);
}
function setupWindowsLoopbackProcessing() {
// Setup audio processing for Windows loopback audio only
audioContext = new AudioContext({ sampleRate: SAMPLE_RATE });
const source = audioContext.createMediaStreamSource(mediaStream);
audioProcessor = audioContext.createScriptProcessor(BUFFER_SIZE, 1, 1);
let audioBuffer = [];
const samplesPerChunk = SAMPLE_RATE * AUDIO_CHUNK_DURATION;
audioProcessor.onaudioprocess = async e => {
const inputData = e.inputBuffer.getChannelData(0);
audioBuffer.push(...inputData);
// Process audio in chunks
while (audioBuffer.length >= samplesPerChunk) {
const chunk = audioBuffer.splice(0, samplesPerChunk);
const pcmData16 = convertFloat32ToInt16(chunk);
const base64Data = arrayBufferToBase64(pcmData16.buffer);
await ipcRenderer.invoke('send-audio-content', {
data: base64Data,
mimeType: 'audio/pcm;rate=24000',
});
}
};
source.connect(audioProcessor);
audioProcessor.connect(audioContext.destination);
}
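All three audio processors above share the same chunking pattern: samples accumulate in a plain array and are drained in fixed 100 ms chunks (2400 samples at 24 kHz), with any remainder left waiting for the next `onaudioprocess` callback. A standalone sketch of just the drain loop:

```javascript
// Standalone sketch of the 100 ms chunking loop shared by the audio processors above.
const SAMPLE_RATE = 24000;
const AUDIO_CHUNK_DURATION = 0.1; // seconds
const samplesPerChunk = SAMPLE_RATE * AUDIO_CHUNK_DURATION; // 2400 samples

function drainChunks(buffer, onChunk) {
    let sent = 0;
    while (buffer.length >= samplesPerChunk) {
        // splice removes the chunk from the front of the buffer in place
        onChunk(buffer.splice(0, samplesPerChunk));
        sent++;
    }
    return sent;
}

const buffer = new Array(5000).fill(0);
const sent = drainChunks(buffer, () => {});
// sent === 2; buffer.length === 200 (the remainder waits for the next callback)
```

Because the 4096-sample ScriptProcessor callback size does not divide evenly into 2400-sample chunks, this buffer-and-drain approach is what keeps chunk boundaries exact across callbacks.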
async function captureScreenshot(imageQuality = 'medium', isManual = false) {
console.log(`Capturing ${isManual ? 'manual' : 'automated'} screenshot...`);
if (!mediaStream) return;
// Lazy init of video element
if (!hiddenVideo) {
hiddenVideo = document.createElement('video');
hiddenVideo.srcObject = mediaStream;
hiddenVideo.muted = true;
hiddenVideo.playsInline = true;
await hiddenVideo.play();
await new Promise(resolve => {
if (hiddenVideo.readyState >= 2) return resolve();
hiddenVideo.onloadedmetadata = () => resolve();
});
// Lazy init of canvas based on video dimensions
offscreenCanvas = document.createElement('canvas');
offscreenCanvas.width = hiddenVideo.videoWidth;
offscreenCanvas.height = hiddenVideo.videoHeight;
offscreenContext = offscreenCanvas.getContext('2d');
}
// Check if video is ready
if (hiddenVideo.readyState < 2) {
console.warn('Video not ready yet, skipping screenshot');
return;
}
offscreenContext.drawImage(hiddenVideo, 0, 0, offscreenCanvas.width, offscreenCanvas.height);
// Check if the image was drawn properly by sampling the top-left pixel
const imageData = offscreenContext.getImageData(0, 0, 1, 1);
const isBlank = imageData.data.every((value, index) => {
// All RGB channels zero means the sampled pixel is black; the alpha channel (index 3) is ignored
return index === 3 ? true : value === 0;
});
if (isBlank) {
console.warn('Screenshot appears to be blank/black');
}
let qualityValue;
switch (imageQuality) {
case 'high':
qualityValue = 0.9;
break;
case 'medium':
qualityValue = 0.7;
break;
case 'low':
qualityValue = 0.5;
break;
default:
qualityValue = 0.7; // Default to medium
}
offscreenCanvas.toBlob(
async blob => {
if (!blob) {
console.error('Failed to create blob from canvas');
return;
}
const reader = new FileReader();
reader.onloadend = async () => {
const base64data = reader.result.split(',')[1];
// Validate base64 data
if (!base64data || base64data.length < 100) {
console.error('Invalid base64 data generated');
return;
}
const result = await ipcRenderer.invoke('send-image-content', {
data: base64data,
});
if (result.success) {
console.log(`Image sent successfully (${offscreenCanvas.width}x${offscreenCanvas.height})`);
} else {
console.error('Failed to send image:', result.error);
}
};
reader.readAsDataURL(blob);
},
'image/jpeg',
qualityValue
);
}
const MANUAL_SCREENSHOT_PROMPT = `Help me on this page, give me the answer no bs, complete answer.
So if it's a code question, give me the approach in a few bullet points, then the entire code. Also if there's anything else I need to know, tell me.
If it's a question about the website, give me the answer no bs, complete answer.
If it's an mcq question, give me the answer no bs, complete answer.`;
async function captureManualScreenshot(imageQuality = null) {
console.log('Manual screenshot triggered');
const quality = imageQuality || currentImageQuality;
if (!mediaStream) {
console.error('No media stream available');
return;
}
// Lazy init of video element
if (!hiddenVideo) {
hiddenVideo = document.createElement('video');
hiddenVideo.srcObject = mediaStream;
hiddenVideo.muted = true;
hiddenVideo.playsInline = true;
await hiddenVideo.play();
await new Promise(resolve => {
if (hiddenVideo.readyState >= 2) return resolve();
hiddenVideo.onloadedmetadata = () => resolve();
});
// Lazy init of canvas based on video dimensions
offscreenCanvas = document.createElement('canvas');
offscreenCanvas.width = hiddenVideo.videoWidth;
offscreenCanvas.height = hiddenVideo.videoHeight;
offscreenContext = offscreenCanvas.getContext('2d');
}
// Check if video is ready
if (hiddenVideo.readyState < 2) {
console.warn('Video not ready yet, skipping screenshot');
return;
}
offscreenContext.drawImage(hiddenVideo, 0, 0, offscreenCanvas.width, offscreenCanvas.height);
let qualityValue;
switch (quality) {
case 'high':
qualityValue = 0.9;
break;
case 'medium':
qualityValue = 0.7;
break;
case 'low':
qualityValue = 0.5;
break;
default:
qualityValue = 0.7;
}
offscreenCanvas.toBlob(
async blob => {
if (!blob) {
console.error('Failed to create blob from canvas');
return;
}
const reader = new FileReader();
reader.onloadend = async () => {
const base64data = reader.result.split(',')[1];
if (!base64data || base64data.length < 100) {
console.error('Invalid base64 data generated');
return;
}
// Send image with prompt to HTTP API (response streams via IPC events)
const result = await ipcRenderer.invoke('send-image-content', {
data: base64data,
prompt: MANUAL_SCREENSHOT_PROMPT,
});
if (result.success) {
console.log(`Image response completed from ${result.model}`);
// Response already displayed via streaming events (new-response/update-response)
} else {
console.error('Failed to get image response:', result.error);
cheatingDaddy.addNewResponse(`Error: ${result.error}`);
}
};
reader.readAsDataURL(blob);
},
'image/jpeg',
qualityValue
);
}
// Expose functions to global scope for external access
window.captureManualScreenshot = captureManualScreenshot;
function stopCapture() {
if (screenshotInterval) {
clearInterval(screenshotInterval);
screenshotInterval = null;
}
if (audioProcessor) {
audioProcessor.disconnect();
audioProcessor = null;
}
// Clean up microphone audio processor (used on all platforms)
if (micAudioProcessor) {
micAudioProcessor.disconnect();
micAudioProcessor = null;
}
if (audioContext) {
audioContext.close();
audioContext = null;
}
if (mediaStream) {
mediaStream.getTracks().forEach(track => track.stop());
mediaStream = null;
}
// Stop macOS audio capture if running
if (isMacOS) {
ipcRenderer.invoke('stop-macos-audio').catch(err => {
console.error('Error stopping macOS audio:', err);
});
}
// Clean up hidden elements
if (hiddenVideo) {
hiddenVideo.pause();
hiddenVideo.srcObject = null;
hiddenVideo = null;
}
offscreenCanvas = null;
offscreenContext = null;
}
// Send text message to Gemini
async function sendTextMessage(text) {
if (!text || text.trim().length === 0) {
console.warn('Cannot send empty text message');
return { success: false, error: 'Empty message' };
}
try {
const result = await ipcRenderer.invoke('send-text-message', text);
if (result.success) {
console.log('Text message sent successfully');
} else {
console.error('Failed to send text message:', result.error);
}
return result;
} catch (error) {
console.error('Error sending text message:', error);
return { success: false, error: error.message };
}
}
// Listen for conversation data from main process and save to storage
ipcRenderer.on('save-conversation-turn', async (event, data) => {
try {
await storage.saveSession(data.sessionId, { conversationHistory: data.fullHistory });
console.log('Conversation session saved:', data.sessionId);
} catch (error) {
console.error('Error saving conversation session:', error);
}
});
// Listen for session context (profile info) when session starts
ipcRenderer.on('save-session-context', async (event, data) => {
try {
await storage.saveSession(data.sessionId, {
profile: data.profile,
customPrompt: data.customPrompt
});
console.log('Session context saved:', data.sessionId, 'profile:', data.profile);
} catch (error) {
console.error('Error saving session context:', error);
}
});
// Listen for screen analysis responses (from ctrl+enter)
ipcRenderer.on('save-screen-analysis', async (event, data) => {
try {
await storage.saveSession(data.sessionId, {
screenAnalysisHistory: data.fullHistory,
profile: data.profile,
customPrompt: data.customPrompt
});
console.log('Screen analysis saved:', data.sessionId);
} catch (error) {
console.error('Error saving screen analysis:', error);
}
});
// Listen for emergency erase command from main process
ipcRenderer.on('clear-sensitive-data', async () => {
console.log('Clearing all data...');
await storage.clearAll();
});
// Handle shortcuts based on current view
function handleShortcut(shortcutKey) {
const currentView = cheatingDaddy.getCurrentView();
if (shortcutKey === 'ctrl+enter' || shortcutKey === 'cmd+enter') {
if (currentView === 'main') {
cheatingDaddy.element().handleStart();
} else {
captureManualScreenshot();
}
}
}
// Create reference to the main app element
const cheatingDaddyApp = document.querySelector('cheating-daddy-app');
// ============ THEME SYSTEM ============
const theme = {
themes: {
dark: {
background: '#1e1e1e',
text: '#e0e0e0', textSecondary: '#a0a0a0', textMuted: '#6b6b6b',
border: '#333333', accent: '#ffffff',
btnPrimaryBg: '#ffffff', btnPrimaryText: '#000000', btnPrimaryHover: '#e0e0e0',
tooltipBg: '#1a1a1a', tooltipText: '#ffffff',
keyBg: 'rgba(255,255,255,0.1)'
},
light: {
background: '#ffffff',
text: '#1a1a1a', textSecondary: '#555555', textMuted: '#888888',
border: '#e0e0e0', accent: '#000000',
btnPrimaryBg: '#1a1a1a', btnPrimaryText: '#ffffff', btnPrimaryHover: '#333333',
tooltipBg: '#1a1a1a', tooltipText: '#ffffff',
keyBg: 'rgba(0,0,0,0.1)'
},
midnight: {
background: '#0d1117',
text: '#c9d1d9', textSecondary: '#8b949e', textMuted: '#6e7681',
border: '#30363d', accent: '#58a6ff',
btnPrimaryBg: '#58a6ff', btnPrimaryText: '#0d1117', btnPrimaryHover: '#79b8ff',
tooltipBg: '#161b22', tooltipText: '#c9d1d9',
keyBg: 'rgba(88,166,255,0.15)'
},
sepia: {
background: '#f4ecd8',
text: '#5c4b37', textSecondary: '#7a6a56', textMuted: '#998875',
border: '#d4c8b0', accent: '#8b4513',
btnPrimaryBg: '#5c4b37', btnPrimaryText: '#f4ecd8', btnPrimaryHover: '#7a6a56',
tooltipBg: '#5c4b37', tooltipText: '#f4ecd8',
keyBg: 'rgba(92,75,55,0.15)'
},
nord: {
background: '#2e3440',
text: '#eceff4', textSecondary: '#d8dee9', textMuted: '#4c566a',
border: '#3b4252', accent: '#88c0d0',
btnPrimaryBg: '#88c0d0', btnPrimaryText: '#2e3440', btnPrimaryHover: '#8fbcbb',
tooltipBg: '#3b4252', tooltipText: '#eceff4',
keyBg: 'rgba(136,192,208,0.15)'
},
dracula: {
background: '#282a36',
text: '#f8f8f2', textSecondary: '#bd93f9', textMuted: '#6272a4',
border: '#44475a', accent: '#ff79c6',
btnPrimaryBg: '#ff79c6', btnPrimaryText: '#282a36', btnPrimaryHover: '#ff92d0',
tooltipBg: '#44475a', tooltipText: '#f8f8f2',
keyBg: 'rgba(255,121,198,0.15)'
},
abyss: {
background: '#0a0a0a',
text: '#d4d4d4', textSecondary: '#808080', textMuted: '#505050',
border: '#1a1a1a', accent: '#ffffff',
btnPrimaryBg: '#ffffff', btnPrimaryText: '#0a0a0a', btnPrimaryHover: '#d4d4d4',
tooltipBg: '#141414', tooltipText: '#d4d4d4',
keyBg: 'rgba(255,255,255,0.08)'
}
},
current: 'dark',
get(name) {
return this.themes[name] || this.themes.dark;
},
getAll() {
const names = {
dark: 'Dark',
light: 'Light',
midnight: 'Midnight Blue',
sepia: 'Sepia',
nord: 'Nord',
dracula: 'Dracula',
abyss: 'Abyss'
};
return Object.keys(this.themes).map(key => ({
value: key,
name: names[key] || key,
colors: this.themes[key]
}));
},
hexToRgb(hex) {
const result = /^#?([a-f\d]{2})([a-f\d]{2})([a-f\d]{2})$/i.exec(hex);
return result ? {
r: parseInt(result[1], 16),
g: parseInt(result[2], 16),
b: parseInt(result[3], 16)
} : { r: 30, g: 30, b: 30 };
},
lightenColor(rgb, amount) {
return {
r: Math.min(255, rgb.r + amount),
g: Math.min(255, rgb.g + amount),
b: Math.min(255, rgb.b + amount)
};
},
darkenColor(rgb, amount) {
return {
r: Math.max(0, rgb.r - amount),
g: Math.max(0, rgb.g - amount),
b: Math.max(0, rgb.b - amount)
};
},
applyBackgrounds(backgroundColor, alpha = 0.8) {
const root = document.documentElement;
const baseRgb = this.hexToRgb(backgroundColor);
// For light themes, darken; for dark themes, lighten
const isLight = (baseRgb.r + baseRgb.g + baseRgb.b) / 3 > 128;
const adjust = isLight ? this.darkenColor.bind(this) : this.lightenColor.bind(this);
const secondary = adjust(baseRgb, 7);
const tertiary = adjust(baseRgb, 15);
const hover = adjust(baseRgb, 20);
root.style.setProperty('--header-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--main-content-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--bg-primary', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
root.style.setProperty('--bg-secondary', `rgba(${secondary.r}, ${secondary.g}, ${secondary.b}, ${alpha})`);
root.style.setProperty('--bg-tertiary', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--bg-hover', `rgba(${hover.r}, ${hover.g}, ${hover.b}, ${alpha})`);
root.style.setProperty('--input-background', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--input-focus-background', `rgba(${tertiary.r}, ${tertiary.g}, ${tertiary.b}, ${alpha})`);
root.style.setProperty('--hover-background', `rgba(${hover.r}, ${hover.g}, ${hover.b}, ${alpha})`);
root.style.setProperty('--scrollbar-background', `rgba(${baseRgb.r}, ${baseRgb.g}, ${baseRgb.b}, ${alpha})`);
},
apply(themeName, alpha = 0.8) {
const colors = this.get(themeName);
this.current = themeName;
const root = document.documentElement;
// Text colors
root.style.setProperty('--text-color', colors.text);
root.style.setProperty('--text-secondary', colors.textSecondary);
root.style.setProperty('--text-muted', colors.textMuted);
// Border colors
root.style.setProperty('--border-color', colors.border);
root.style.setProperty('--border-default', colors.accent);
// Misc
root.style.setProperty('--placeholder-color', colors.textMuted);
root.style.setProperty('--scrollbar-thumb', colors.border);
root.style.setProperty('--scrollbar-thumb-hover', colors.textMuted);
root.style.setProperty('--key-background', colors.keyBg);
// Primary button
root.style.setProperty('--btn-primary-bg', colors.btnPrimaryBg);
root.style.setProperty('--btn-primary-text', colors.btnPrimaryText);
root.style.setProperty('--btn-primary-hover', colors.btnPrimaryHover);
// Start button (same as primary)
root.style.setProperty('--start-button-background', colors.btnPrimaryBg);
root.style.setProperty('--start-button-color', colors.btnPrimaryText);
root.style.setProperty('--start-button-hover-background', colors.btnPrimaryHover);
// Tooltip
root.style.setProperty('--tooltip-bg', colors.tooltipBg);
root.style.setProperty('--tooltip-text', colors.tooltipText);
        // Status colors (constant across themes)
root.style.setProperty('--error-color', '#f14c4c');
root.style.setProperty('--success-color', '#4caf50');
// Also apply background colors from theme
this.applyBackgrounds(colors.background, alpha);
},
async load() {
try {
const prefs = await storage.getPreferences();
const themeName = prefs.theme || 'dark';
const alpha = prefs.backgroundTransparency ?? 0.8;
this.apply(themeName, alpha);
return themeName;
} catch (err) {
this.apply('dark');
return 'dark';
}
},
    async save(themeName) {
        await storage.updatePreference('theme', themeName);
        // Re-read preferences so the user's background transparency is preserved
        const prefs = await storage.getPreferences();
        this.apply(themeName, prefs.backgroundTransparency ?? 0.8);
    }
};
// Consolidated cheatingDaddy object - all functions in one place
const cheatingDaddy = {
// App version
getVersion: async () => ipcRenderer.invoke('get-app-version'),
// Element access
element: () => cheatingDaddyApp,
e: () => cheatingDaddyApp,
// App state functions - access properties directly from the app element
getCurrentView: () => cheatingDaddyApp.currentView,
getLayoutMode: () => cheatingDaddyApp.layoutMode,
// Status and response functions
setStatus: text => cheatingDaddyApp.setStatus(text),
addNewResponse: response => cheatingDaddyApp.addNewResponse(response),
updateCurrentResponse: response => cheatingDaddyApp.updateCurrentResponse(response),
// Core functionality
initializeGemini,
startCapture,
stopCapture,
sendTextMessage,
handleShortcut,
// Storage API
storage,
// Theme API
theme,
// Refresh preferences cache (call after updating preferences)
refreshPreferencesCache: loadPreferencesCache,
// Platform detection
isLinux: isLinux,
isMacOS: isMacOS,
};
// Make it globally available
window.cheatingDaddy = cheatingDaddy;
// Load theme after DOM is ready
if (document.readyState === 'loading') {
document.addEventListener('DOMContentLoaded', () => theme.load());
} else {
theme.load();
}

503
src/utils/window.js Normal file

@ -0,0 +1,503 @@
const { BrowserWindow, globalShortcut, ipcMain, screen } = require('electron');
const path = require('node:path');
const fs = require('node:fs');
const os = require('os');
const storage = require('../storage');
let mouseEventsIgnored = false;
let windowResizing = false;
let resizeAnimation = null;
const RESIZE_ANIMATION_DURATION = 500; // milliseconds
function createWindow(sendToRenderer, geminiSessionRef) {
    // Initial window size; per-view sizes are applied later by the 'update-sizes' handler
    let windowWidth = 1100;
    let windowHeight = 800;
const mainWindow = new BrowserWindow({
width: windowWidth,
height: windowHeight,
frame: false,
transparent: true,
hasShadow: false,
alwaysOnTop: true,
webPreferences: {
nodeIntegration: true,
contextIsolation: false, // TODO: change to true
backgroundThrottling: false,
enableBlinkFeatures: 'GetDisplayMedia',
webSecurity: true,
allowRunningInsecureContent: false,
},
backgroundColor: '#00000000',
});
const { session, desktopCapturer } = require('electron');
    session.defaultSession.setDisplayMediaRequestHandler(
        (request, callback) => {
            desktopCapturer.getSources({ types: ['screen'] }).then(sources => {
                // An empty source list (e.g. capture permission denied) would make sources[0] undefined
                callback(sources.length > 0 ? { video: sources[0], audio: 'loopback' } : {});
            });
        },
        { useSystemPicker: true }
    );
mainWindow.setResizable(false);
mainWindow.setContentProtection(true);
mainWindow.setVisibleOnAllWorkspaces(true, { visibleOnFullScreen: true });
// Hide from Windows taskbar
if (process.platform === 'win32') {
try {
mainWindow.setSkipTaskbar(true);
console.log('Hidden from Windows taskbar');
} catch (error) {
console.warn('Could not hide from taskbar:', error.message);
}
}
// Hide from Mission Control on macOS
if (process.platform === 'darwin') {
try {
mainWindow.setHiddenInMissionControl(true);
console.log('Hidden from macOS Mission Control');
} catch (error) {
console.warn('Could not hide from Mission Control:', error.message);
}
}
// Center window at the top of the screen
const primaryDisplay = screen.getPrimaryDisplay();
const { width: screenWidth } = primaryDisplay.workAreaSize;
const x = Math.floor((screenWidth - windowWidth) / 2);
const y = 0;
mainWindow.setPosition(x, y);
if (process.platform === 'win32') {
mainWindow.setAlwaysOnTop(true, 'screen-saver', 1);
}
mainWindow.loadFile(path.join(__dirname, '../index.html'));
// After window is created, initialize keybinds
mainWindow.webContents.once('dom-ready', () => {
setTimeout(() => {
const defaultKeybinds = getDefaultKeybinds();
let keybinds = defaultKeybinds;
// Load keybinds from storage
const savedKeybinds = storage.getKeybinds();
if (savedKeybinds) {
keybinds = { ...defaultKeybinds, ...savedKeybinds };
}
updateGlobalShortcuts(keybinds, mainWindow, sendToRenderer, geminiSessionRef);
}, 150);
});
setupWindowIpcHandlers(mainWindow, sendToRenderer, geminiSessionRef);
return mainWindow;
}
function getDefaultKeybinds() {
const isMac = process.platform === 'darwin';
return {
moveUp: isMac ? 'Alt+Up' : 'Ctrl+Up',
moveDown: isMac ? 'Alt+Down' : 'Ctrl+Down',
moveLeft: isMac ? 'Alt+Left' : 'Ctrl+Left',
moveRight: isMac ? 'Alt+Right' : 'Ctrl+Right',
toggleVisibility: isMac ? 'Cmd+\\' : 'Ctrl+\\',
toggleClickThrough: isMac ? 'Cmd+M' : 'Ctrl+M',
nextStep: isMac ? 'Cmd+Enter' : 'Ctrl+Enter',
previousResponse: isMac ? 'Cmd+[' : 'Ctrl+[',
nextResponse: isMac ? 'Cmd+]' : 'Ctrl+]',
scrollUp: isMac ? 'Cmd+Shift+Up' : 'Ctrl+Shift+Up',
scrollDown: isMac ? 'Cmd+Shift+Down' : 'Ctrl+Shift+Down',
emergencyErase: isMac ? 'Cmd+Shift+E' : 'Ctrl+Shift+E',
};
}
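The load path in `createWindow` merges saved keybinds over these defaults with a shallow spread, so any action the user has not rebound keeps its platform default. A small sketch of that merge semantics (the saved override is hypothetical):

```javascript
// Shallow-merge semantics used when loading keybinds:
// saved values win, unset actions fall back to the defaults.
const defaults = {
    moveUp: 'Ctrl+Up',
    toggleVisibility: 'Ctrl+\\',
    nextStep: 'Ctrl+Enter',
};
const saved = { nextStep: 'Ctrl+Space' }; // hypothetical user override
const keybinds = { ...defaults, ...saved };
// keybinds.nextStep is 'Ctrl+Space'; moveUp and toggleVisibility keep defaults
```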
function updateGlobalShortcuts(keybinds, mainWindow, sendToRenderer, geminiSessionRef) {
console.log('Updating global shortcuts with:', keybinds);
// Unregister all existing shortcuts
globalShortcut.unregisterAll();
const primaryDisplay = screen.getPrimaryDisplay();
const { width, height } = primaryDisplay.workAreaSize;
const moveIncrement = Math.floor(Math.min(width, height) * 0.1);
// Register window movement shortcuts
const movementActions = {
moveUp: () => {
if (!mainWindow.isVisible()) return;
const [currentX, currentY] = mainWindow.getPosition();
mainWindow.setPosition(currentX, currentY - moveIncrement);
},
moveDown: () => {
if (!mainWindow.isVisible()) return;
const [currentX, currentY] = mainWindow.getPosition();
mainWindow.setPosition(currentX, currentY + moveIncrement);
},
moveLeft: () => {
if (!mainWindow.isVisible()) return;
const [currentX, currentY] = mainWindow.getPosition();
mainWindow.setPosition(currentX - moveIncrement, currentY);
},
moveRight: () => {
if (!mainWindow.isVisible()) return;
const [currentX, currentY] = mainWindow.getPosition();
mainWindow.setPosition(currentX + moveIncrement, currentY);
},
};
// Register each movement shortcut
Object.keys(movementActions).forEach(action => {
const keybind = keybinds[action];
if (keybind) {
try {
globalShortcut.register(keybind, movementActions[action]);
console.log(`Registered ${action}: ${keybind}`);
} catch (error) {
console.error(`Failed to register ${action} (${keybind}):`, error);
}
}
});
// Register toggle visibility shortcut
if (keybinds.toggleVisibility) {
try {
globalShortcut.register(keybinds.toggleVisibility, () => {
if (mainWindow.isVisible()) {
mainWindow.hide();
} else {
mainWindow.showInactive();
}
});
console.log(`Registered toggleVisibility: ${keybinds.toggleVisibility}`);
} catch (error) {
console.error(`Failed to register toggleVisibility (${keybinds.toggleVisibility}):`, error);
}
}
// Register toggle click-through shortcut
if (keybinds.toggleClickThrough) {
try {
globalShortcut.register(keybinds.toggleClickThrough, () => {
mouseEventsIgnored = !mouseEventsIgnored;
if (mouseEventsIgnored) {
mainWindow.setIgnoreMouseEvents(true, { forward: true });
console.log('Mouse events ignored');
} else {
mainWindow.setIgnoreMouseEvents(false);
console.log('Mouse events enabled');
}
mainWindow.webContents.send('click-through-toggled', mouseEventsIgnored);
});
console.log(`Registered toggleClickThrough: ${keybinds.toggleClickThrough}`);
} catch (error) {
console.error(`Failed to register toggleClickThrough (${keybinds.toggleClickThrough}):`, error);
}
}
// Register next step shortcut (either starts session or takes screenshot based on view)
if (keybinds.nextStep) {
try {
globalShortcut.register(keybinds.nextStep, async () => {
console.log('Next step shortcut triggered');
try {
// Determine the shortcut key format
const isMac = process.platform === 'darwin';
const shortcutKey = isMac ? 'cmd+enter' : 'ctrl+enter';
// Use the new handleShortcut function
mainWindow.webContents.executeJavaScript(`
cheatingDaddy.handleShortcut('${shortcutKey}');
`);
} catch (error) {
console.error('Error handling next step shortcut:', error);
}
});
console.log(`Registered nextStep: ${keybinds.nextStep}`);
} catch (error) {
console.error(`Failed to register nextStep (${keybinds.nextStep}):`, error);
}
}
// Register previous response shortcut
if (keybinds.previousResponse) {
try {
globalShortcut.register(keybinds.previousResponse, () => {
console.log('Previous response shortcut triggered');
sendToRenderer('navigate-previous-response');
});
console.log(`Registered previousResponse: ${keybinds.previousResponse}`);
} catch (error) {
console.error(`Failed to register previousResponse (${keybinds.previousResponse}):`, error);
}
}
// Register next response shortcut
if (keybinds.nextResponse) {
try {
globalShortcut.register(keybinds.nextResponse, () => {
console.log('Next response shortcut triggered');
sendToRenderer('navigate-next-response');
});
console.log(`Registered nextResponse: ${keybinds.nextResponse}`);
} catch (error) {
console.error(`Failed to register nextResponse (${keybinds.nextResponse}):`, error);
}
}
// Register scroll up shortcut
if (keybinds.scrollUp) {
try {
globalShortcut.register(keybinds.scrollUp, () => {
console.log('Scroll up shortcut triggered');
sendToRenderer('scroll-response-up');
});
console.log(`Registered scrollUp: ${keybinds.scrollUp}`);
} catch (error) {
console.error(`Failed to register scrollUp (${keybinds.scrollUp}):`, error);
}
}
// Register scroll down shortcut
if (keybinds.scrollDown) {
try {
globalShortcut.register(keybinds.scrollDown, () => {
console.log('Scroll down shortcut triggered');
sendToRenderer('scroll-response-down');
});
console.log(`Registered scrollDown: ${keybinds.scrollDown}`);
} catch (error) {
console.error(`Failed to register scrollDown (${keybinds.scrollDown}):`, error);
}
}
// Register emergency erase shortcut
if (keybinds.emergencyErase) {
try {
globalShortcut.register(keybinds.emergencyErase, () => {
console.log('Emergency Erase triggered!');
if (mainWindow && !mainWindow.isDestroyed()) {
mainWindow.hide();
if (geminiSessionRef.current) {
geminiSessionRef.current.close();
geminiSessionRef.current = null;
}
sendToRenderer('clear-sensitive-data');
setTimeout(() => {
const { app } = require('electron');
app.quit();
}, 300);
}
});
console.log(`Registered emergencyErase: ${keybinds.emergencyErase}`);
} catch (error) {
console.error(`Failed to register emergencyErase (${keybinds.emergencyErase}):`, error);
}
}
}
function setupWindowIpcHandlers(mainWindow, sendToRenderer, geminiSessionRef) {
ipcMain.on('view-changed', (event, view) => {
if (view !== 'assistant' && !mainWindow.isDestroyed()) {
mainWindow.setIgnoreMouseEvents(false);
}
});
ipcMain.handle('window-minimize', () => {
if (!mainWindow.isDestroyed()) {
mainWindow.minimize();
}
});
ipcMain.on('update-keybinds', (event, newKeybinds) => {
if (!mainWindow.isDestroyed()) {
updateGlobalShortcuts(newKeybinds, mainWindow, sendToRenderer, geminiSessionRef);
}
});
ipcMain.handle('toggle-window-visibility', async event => {
try {
if (mainWindow.isDestroyed()) {
return { success: false, error: 'Window has been destroyed' };
}
if (mainWindow.isVisible()) {
mainWindow.hide();
} else {
mainWindow.showInactive();
}
return { success: true };
} catch (error) {
console.error('Error toggling window visibility:', error);
return { success: false, error: error.message };
}
});
function animateWindowResize(mainWindow, targetWidth, targetHeight, layoutMode) {
return new Promise(resolve => {
// Check if window is destroyed before starting animation
if (mainWindow.isDestroyed()) {
console.log('Cannot animate resize: window has been destroyed');
resolve();
return;
}
// Clear any existing animation
if (resizeAnimation) {
clearInterval(resizeAnimation);
resizeAnimation = null;
}
const [startWidth, startHeight] = mainWindow.getSize();
// If already at target size, no need to animate
if (startWidth === targetWidth && startHeight === targetHeight) {
console.log(`Window already at target size for ${layoutMode} mode`);
resolve();
return;
}
console.log(`Starting animated resize from ${startWidth}x${startHeight} to ${targetWidth}x${targetHeight}`);
windowResizing = true;
mainWindow.setResizable(true);
const frameRate = 60; // 60 FPS
const totalFrames = Math.floor(RESIZE_ANIMATION_DURATION / (1000 / frameRate));
let currentFrame = 0;
const widthDiff = targetWidth - startWidth;
const heightDiff = targetHeight - startHeight;
resizeAnimation = setInterval(() => {
currentFrame++;
const progress = currentFrame / totalFrames;
// Use easing function (ease-out)
const easedProgress = 1 - Math.pow(1 - progress, 3);
const currentWidth = Math.round(startWidth + widthDiff * easedProgress);
const currentHeight = Math.round(startHeight + heightDiff * easedProgress);
if (!mainWindow || mainWindow.isDestroyed()) {
clearInterval(resizeAnimation);
resizeAnimation = null;
windowResizing = false;
return;
}
mainWindow.setSize(currentWidth, currentHeight);
// Re-center the window during animation
const primaryDisplay = screen.getPrimaryDisplay();
const { width: screenWidth } = primaryDisplay.workAreaSize;
const x = Math.floor((screenWidth - currentWidth) / 2);
const y = 0;
mainWindow.setPosition(x, y);
if (currentFrame >= totalFrames) {
clearInterval(resizeAnimation);
resizeAnimation = null;
windowResizing = false;
// Check if window is still valid before final operations
if (!mainWindow.isDestroyed()) {
mainWindow.setResizable(false);
// Ensure final size is exact
mainWindow.setSize(targetWidth, targetHeight);
const finalX = Math.floor((screenWidth - targetWidth) / 2);
mainWindow.setPosition(finalX, 0);
}
console.log(`Animation complete: ${targetWidth}x${targetHeight}`);
resolve();
}
}, 1000 / frameRate);
});
}
ipcMain.handle('update-sizes', async event => {
try {
if (mainWindow.isDestroyed()) {
return { success: false, error: 'Window has been destroyed' };
}
// Get current view and layout mode from renderer
let viewName, layoutMode;
try {
viewName = await event.sender.executeJavaScript('cheatingDaddy.getCurrentView()');
layoutMode = await event.sender.executeJavaScript('cheatingDaddy.getLayoutMode()');
} catch (error) {
console.warn('Failed to get view/layout from renderer, using defaults:', error);
viewName = 'main';
layoutMode = 'normal';
}
console.log('Size update requested for view:', viewName, 'layout:', layoutMode);
let targetWidth, targetHeight;
// Determine base size from layout mode
const baseWidth = layoutMode === 'compact' ? 700 : 900;
const baseHeight = layoutMode === 'compact' ? 500 : 600;
// Adjust height based on view
switch (viewName) {
case 'main':
targetWidth = baseWidth;
targetHeight = layoutMode === 'compact' ? 320 : 400;
break;
case 'customize':
case 'settings':
targetWidth = baseWidth;
targetHeight = layoutMode === 'compact' ? 700 : 800;
break;
case 'help':
targetWidth = baseWidth;
targetHeight = layoutMode === 'compact' ? 650 : 750;
break;
case 'history':
targetWidth = baseWidth;
targetHeight = layoutMode === 'compact' ? 650 : 750;
break;
case 'assistant':
case 'onboarding':
default:
targetWidth = baseWidth;
targetHeight = baseHeight;
break;
}
const [currentWidth, currentHeight] = mainWindow.getSize();
console.log('Current window size:', currentWidth, 'x', currentHeight);
// If currently resizing, the animation will start from current position
if (windowResizing) {
console.log('Interrupting current resize animation');
}
await animateWindowResize(mainWindow, targetWidth, targetHeight, `${viewName} view (${layoutMode})`);
return { success: true };
} catch (error) {
console.error('Error updating sizes:', error);
return { success: false, error: error.message };
}
});
}
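The per-view size table inside the `update-sizes` handler can be factored as a pure function, which makes the layout numbers easy to verify in isolation. A sketch using the same values as the switch above:

```javascript
// Pure version of the view -> target-size table from the 'update-sizes' handler.
function targetSizeFor(viewName, layoutMode) {
    const compact = layoutMode === 'compact';
    const width = compact ? 700 : 900;
    switch (viewName) {
        case 'main':
            return { width, height: compact ? 320 : 400 };
        case 'customize':
        case 'settings':
            return { width, height: compact ? 700 : 800 };
        case 'help':
        case 'history':
            return { width, height: compact ? 650 : 750 };
        default: // 'assistant', 'onboarding', and unknown views use the base size
            return { width, height: compact ? 500 : 600 };
    }
}
```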
module.exports = {
createWindow,
getDefaultKeybinds,
updateGlobalShortcuts,
setupWindowIpcHandlers,
};

15
src/utils/windowResize.js Normal file

@ -0,0 +1,15 @@
export async function resizeLayout() {
try {
if (window.require) {
const { ipcRenderer } = window.require('electron');
const result = await ipcRenderer.invoke('update-sizes');
if (result.success) {
console.log('Window resized for current view');
} else {
console.error('Failed to resize window:', result.error);
}
}
} catch (error) {
console.error('Error resizing window:', error);
}
}