Early Access User: Feedback after 2 hours using the platform

Overall, my experience with the platform has been positive so far. The generation and preview features are working well, and I’m looking forward to testing the interface further in the coming weeks by integrating it with a backend project I’m currently working on.

However, I do have a few observations and suggestions for improvement:

  1. Token Context Limitations
    I noticed that after processing around 350,000 tokens, code generation started returning errors, specifically authentication errors with Google services. While I understand that token limits depend on the underlying model, it would be helpful to know whether any summarization or optimization strategy is applied to manage large contexts. If not, this could be an area for improvement so that larger projects are handled more effectively.

  2. Version History and Chat Integration
    One feature that would significantly enhance usability is the ability to navigate directly to the chat session that generated a specific version in the version history. For example, when reviewing past versions, having an option to “jump to the chat” associated with that version would provide better context and make it easier to revisit or refine earlier work. This would streamline the workflow and improve the overall user experience.
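To illustrate the large-context concern in point 1, one common mitigation is a rolling-summary strategy: once the conversation exceeds a token budget, the oldest messages are folded into a summary while recent messages stay verbatim. The sketch below is purely illustrative and assumes a hypothetical `summarize` call and a crude whitespace token count; it is not the platform's actual implementation.

```python
def count_tokens(text: str) -> int:
    # Crude approximation; real systems use a model-specific tokenizer.
    return len(text.split())

def summarize(messages: list[str]) -> str:
    # Stand-in for an LLM summarization call (hypothetical).
    return "Summary of %d earlier messages." % len(messages)

def compact_history(messages: list[str], budget: int = 1000) -> list[str]:
    """Fold the oldest messages into one summary once the total exceeds budget."""
    total = sum(count_tokens(m) for m in messages)
    if total <= budget:
        return messages
    # Walk backwards, keeping the most recent messages verbatim
    # until roughly half the budget is spent.
    kept, used = [], 0
    for m in reversed(messages):
        used += count_tokens(m)
        if used > budget // 2:
            break
        kept.append(m)
    kept.reverse()
    older = messages[: len(messages) - len(kept)]
    return [summarize(older)] + kept
```

With this shape, only the summary plus the recent tail is sent to the model, so the effective context stays bounded regardless of project size.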

Thank you for the opportunity to provide feedback as an early access user. I’m excited to see how the platform evolves and am happy to assist with further testing or insights if needed.

Status: In Review
Board: 💡 Feature Request
Date: 5 months ago
Author: Lenyn Alcántara
