Overall, my experience with the platform has been positive so far. The generation and preview features are working well, and I’m looking forward to testing the interface further in the coming weeks by integrating it with a backend project I’m currently working on.
However, I do have a few observations and suggestions for improvement:
Token Context Limitations
I noticed that after processing around 350,000 tokens, code generation began returning errors, specifically authentication failures with Google services. While I understand that token limits depend on the underlying LLM, it would be helpful to know whether any summarization or optimization strategy is applied to manage large contexts. If not, this could be an area for improvement so larger projects are handled more effectively; a rough sketch of the kind of strategy I mean follows below.
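To illustrate, here is a minimal sketch of a rolling-summarization approach, written in Python purely as an assumption about how such a mechanism could work. Every name in it (count_tokens, summarize, the token budgets) is hypothetical and not part of the platform's actual API or internals.

```python
# A hypothetical rolling-summarization loop; nothing here reflects the
# platform's real internals, it only illustrates the suggestion above.

MAX_CONTEXT_TOKENS = 350_000   # roughly where I started seeing errors
SUMMARY_BUDGET = 20_000        # tokens reserved for the running summary


def count_tokens(messages):
    # Placeholder estimate; a real implementation would use the model's tokenizer.
    return sum(len(m["content"].split()) for m in messages)


def compact_context(messages, summarize):
    """Fold the oldest messages into one summary when the context gets too big.

    `summarize` is an assumed callable (e.g. a cheaper LLM call) that turns
    a list of messages into a short summary string.
    """
    while count_tokens(messages) > MAX_CONTEXT_TOKENS - SUMMARY_BUDGET:
        if len(messages) < 2:
            break  # nothing left to fold; a single message alone exceeds the budget
        half = len(messages) // 2
        summary = summarize(messages[:half])
        messages = [{"role": "system",
                     "content": f"Summary of earlier work: {summary}"}] + messages[half:]
    return messages
```

Even a simple heuristic like this would keep long sessions under the model's limit while preserving recent detail verbatim.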
Version History and Chat Integration
One feature that would significantly enhance usability is the ability to jump directly from an entry in the version history to the chat session that generated it. For example, when reviewing past versions, a "jump to the chat" option on each version would provide better context and make it easier to revisit or refine earlier work. This would streamline the workflow and improve the overall user experience; a small sketch of the data linkage this implies is included below.
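As a concrete illustration of what I am imagining (the field names and URL shape are my own assumptions, not the platform's actual schema), each version entry would only need to carry the id of the chat session that produced it:

```python
# Hypothetical sketch of the version-to-chat linkage this request implies.
from dataclasses import dataclass


@dataclass
class Version:
    version_id: str
    chat_session_id: str  # the session whose messages produced this version
    created_at: str


def chat_url_for(version: Version) -> str:
    # A "jump to the chat" action only needs the stored session id.
    return f"/chats/{version.chat_session_id}"
```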
Thank you for the opportunity to provide feedback as an early access user. I’m excited to see how the platform evolves and am happy to assist with further testing or insights if needed.
Feature Request | In Review | Lenyn Alcántara | 5 months ago