Summarize your
writing experience

Coming soon

What is "Summarize"
How does it help me?

Summarize is a next-*gen* word processor that lets you go fast ⏩

The whole "next gen" thing might sound a little buzzwordy, but there is nothing quite like Summarize to compare it to. It is simply a different type of word processor / knowledge base that hasn't existed until now. Summarize lets you write quicker, organize and search faster, and (as you can tell from the name) summarize web content or documents.


What makes it different?

Summarize's local ML engine provides capabilities that no other editor has offered before. All of these features can use your idle* GPU to offload the computation, keeping text editing responsive and private. A hypothetical sketch of what such an engine interface could look like follows the feature list below.

- Word Prediction

- Sentence Prediction

- Paragraph Generation

- Rich Editing

- Markdown Support

- Automatic Document Categories

- Automatic Document Tagging

- Blazing-Fast Search

- Automatic Document Backlinks

- Summary Generation

- Related Content Summary / Suggestions

- Document Entity Recognition

And much, much more ...
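
To make the list above a little more concrete, here is a minimal, hypothetical sketch (in Rust, the language the engine itself is written in) of what a local ML engine interface covering a few of these features could look like. Every name in it is invented for illustration; Summarize's actual API has not been published.

```rust
// Hypothetical interface for a local ML engine; all names are illustrative.
#[allow(dead_code)]
enum Device {
    Gpu, // preferred when an idle GPU is available
    Cpu, // fallback when it is not
}

/// One request variant per feature family from the list above (abridged).
#[allow(dead_code)]
enum Request<'a> {
    NextWord { context: &'a str },
    NextSentence { context: &'a str },
    Summary { document: &'a str, max_sentences: usize },
    Tags { document: &'a str },
    Entities { document: &'a str },
}

trait LocalMlEngine {
    /// Runs inference entirely on-device; the text never leaves the machine.
    fn infer(&self, request: Request, device: Device) -> Result<String, String>;
}

/// A stub standing in for real model inference.
struct StubEngine;

impl LocalMlEngine for StubEngine {
    fn infer(&self, request: Request, _device: Device) -> Result<String, String> {
        match request {
            Request::NextWord { context } => Ok(format!("{context} [predicted word]")),
            Request::Summary { document, max_sentences } => Ok(document
                .split(". ")
                .take(max_sentences)
                .collect::<Vec<_>>()
                .join(". ")),
            _ => Err("not implemented in this stub".to_string()),
        }
    }
}

fn main() {
    let engine = StubEngine;
    let request = Request::Summary {
        document: "First point. Second point. Third point.",
        max_sentences: 2,
    };
    println!("{:?}", engine.infer(request, Device::Gpu));
}
```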

Okay, but why is it so great?

Let's look at some solid numbers: the stats we have gathered so far from general-purpose usage by average users, and how much Summarize would help.

- 30-50%: time saved while writing

- 5-12%: less time spent searching

- Up to 75%: time saved reading, when content is summarized

How is Summarize built?

Summarize desktop is built on top of its ML engine, written in Rust, which manages all the models and ML inference locally. Its base stack includes the following (a rough sketch of how these pieces could fit together appears after the list):

- Tauri

- TensorFlow / PyTorch

- SolidJS
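
As an illustration of how these layers might fit together (assuming Tauri 1.x conventions), here is a minimal sketch of a Tauri command in Rust that a SolidJS frontend could call through Tauri's `invoke`. The `MlEngine` stand-in, the `summarize_text` command, and the toy summarization logic are all assumptions made for the example, and the usual project scaffolding (tauri.conf.json, the frontend build) is omitted.

```rust
// Hypothetical bridge between the SolidJS frontend and the Rust ML engine via Tauri.
// `MlEngine` and `summarize_text` are illustrative names, not Summarize's real API.
use std::sync::Mutex;

struct MlEngine;

impl MlEngine {
    fn summarize(&self, input: &str, max_sentences: usize) -> String {
        // A real implementation would run a local model; this stub just truncates.
        input.split(". ").take(max_sentences).collect::<Vec<_>>().join(". ")
    }
}

// Callable from the frontend, e.g. invoke("summarize_text", { input, maxSentences }).
#[tauri::command]
fn summarize_text(
    engine: tauri::State<Mutex<MlEngine>>,
    input: String,
    max_sentences: usize,
) -> String {
    engine.lock().unwrap().summarize(&input, max_sentences)
}

fn main() {
    tauri::Builder::default()
        .manage(Mutex::new(MlEngine))
        .invoke_handler(tauri::generate_handler![summarize_text])
        .run(tauri::generate_context!())
        .expect("error while running the Tauri application");
}
```

On the SolidJS side this would be a single async `invoke` call from Tauri's JavaScript API, so all inference stays inside the Rust process.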

Most of the team's time is spent getting the ML models built in TensorFlow and PyTorch fine-tuned just right for our own use case. Summarize will ship with multiple models, each specializing in one particular task or several, depending on the implementation. The long list of features available in Summarize is possible thanks to careful management of GPU and system memory.
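
As a rough sketch of what that management could look like, the snippet below assigns hypothetical per-task models to the GPU while a VRAM budget lasts and keeps the rest in system memory. The task names, model names, sizes, and budget are all invented for the example; the real engine's placement policy is not public.

```rust
// Hypothetical per-task model placement across GPU and system memory.
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum Task {
    WordPrediction,
    Summarization,
    Tagging,
}

#[derive(Debug)]
enum Placement {
    Gpu,       // weights resident in VRAM for low-latency typing features
    SystemRam, // larger or less latency-sensitive models stay in system memory
}

struct ModelSpec {
    name: &'static str,
    vram_mb: u64,
}

/// Greedy placement: keep models on the GPU until the VRAM budget runs out,
/// then fall back to system memory for whatever is left.
fn place_models(specs: &[(Task, ModelSpec)], mut vram_budget_mb: u64) -> HashMap<Task, Placement> {
    let mut placements = HashMap::new();
    for (task, spec) in specs {
        if spec.vram_mb <= vram_budget_mb {
            vram_budget_mb -= spec.vram_mb;
            placements.insert(*task, Placement::Gpu);
        } else {
            placements.insert(*task, Placement::SystemRam);
        }
    }
    placements
}

fn main() {
    // Invented model names and sizes, purely for illustration.
    let specs = [
        (Task::WordPrediction, ModelSpec { name: "word-predictor-small", vram_mb: 300 }),
        (Task::Summarization, ModelSpec { name: "summarizer-base", vram_mb: 1200 }),
        (Task::Tagging, ModelSpec { name: "tagger-small", vram_mb: 200 }),
    ];
    let placements = place_models(&specs, 1000);
    for (task, spec) in &specs {
        println!("{} ({task:?}) -> {:?}", spec.name, placements[task]);
    }
}
```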


Previously, these kinds of models were only practical on the server side, since running them on the edge was a big hassle and came with performance penalties. With the rise of GPU availability in desktop as well as mobile devices in recent years, we are now able to move this to the client side. And worry not: for those who don't want to spend precious laptop battery life running a dedicated GPU, or who don't have access to one, we plan on releasing a server version.
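
One possible shape for that fallback, sketched below, is: prefer an idle dedicated GPU, otherwise hand work to a configured server, otherwise stay on the local CPU. The struct fields and checks are placeholders invented for this example; real detection would query the OS and GPU drivers.

```rust
// Hypothetical backend selection for inference; all fields and names are placeholders.
#[derive(Debug, PartialEq)]
enum Backend {
    LocalGpu,
    LocalCpu,
    RemoteServer,
}

struct HostInfo {
    has_dedicated_gpu: bool,
    on_battery: bool,
    server_configured: bool,
}

fn choose_backend(host: &HostInfo) -> Backend {
    if host.has_dedicated_gpu && !host.on_battery {
        Backend::LocalGpu // fastest and fully private
    } else if host.server_configured {
        Backend::RemoteServer // the planned server version takes over
    } else {
        Backend::LocalCpu // slower, but everything stays on-device
    }
}

fn main() {
    let laptop_on_battery = HostInfo {
        has_dedicated_gpu: true,
        on_battery: true,
        server_configured: false,
    };
    assert_eq!(choose_backend(&laptop_on_battery), Backend::LocalCpu);
    println!("{:?}", choose_backend(&laptop_on_battery));
}
```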

Do not miss any updates.
Subscribe to the newsletter