
13 Jun 2023, 13:02
We are excited to announce the next version of Ravenverse which supports concurrent training of a Requester Graph across multiple participating Providers.

Raven Protocol (RAVEN #6029), via Telegram, 13 Jun 2023, 13:41
Concurrent Training of a Requester Graph Across Multiple Participating Providers 🚀 Release Notes 🚀

We are excited to announce the next version of Ravenverse, which supports concurrent training of a Requester Graph across multiple participating Providers.

Requester Side

1. The number of participants required to compute a graph is now determined and set automatically by our backend based on the complexity of the deep learning graph, its operations, weight size and several other parameters. Requesters can no longer set this number manually in the ravop.execute() function.

2. Example scripts based on fine-tuning the GPT-2 model have been added to the Ravenverse GitHub repository. Requesters can first generate a ".pt" (PyTorch) model file and then load it up for distributed training in the Ravenverse. As an example, we have added a Poem Generator GPT that can be trained to write poems based on different emotions such as fear, anticipation, mystery and horror. Additionally, there is a Number-Sorting application using GPT that takes an input vector and simply sorts it.

3. A novel optimisation technique has been implemented for subgraph formation, yielding a significant speedup in graph compilation time.

Provider Side

4. Improved ping-pong refresh rate for more rapid assignment of subgraphs to providers.

5. New subgraph computation mechanism with reduced payload size for more efficient transfer of results. Massive speedup observed.

6. Backup provider support for maintaining graph progress in case a provider disconnects.

7. GPU benchmarking metrics revised based on graph complexity.

Our Libraries:
Ravpy: https://pypi.org/project/ravpy/
Ravop: https://pypi.org/project/ravop/
RavDL: https://pypi.org/project/ravdl/

You can find the documentation for each library in the respective repo Readme files. Please try them out and let us know if you run into any problems.

Raven Protocol GitHub: https://github.com/ravenprotocol

Enjoy the new release!
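The backend's exact formula for choosing the participant count (item 1) is not published. Purely as a hypothetical illustration of the idea that the count can be derived from graph complexity and weight size, with every threshold and name below invented:

```python
# Hypothetical sketch only: Ravenverse does not publish its heuristic.
# Illustrates deriving a participant count from operation count and
# total weight size (all budgets here are invented for illustration).
import math

def estimate_participants(num_ops: int, total_weight_mb: float,
                          max_subgraph_ops: int = 500,
                          max_weight_per_provider_mb: float = 256.0) -> int:
    """Return a participant count large enough that no single provider
    exceeds either the operation budget or the weight budget."""
    by_ops = math.ceil(num_ops / max_subgraph_ops)
    by_weight = math.ceil(total_weight_mb / max_weight_per_provider_mb)
    return max(1, by_ops, by_weight)

print(estimate_participants(num_ops=1200, total_weight_mb=480.0))  # 3
```

The real backend also weighs "a bunch of other parameters"; the point is only that the number is now computed from graph properties rather than passed by the Requester.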
❤️ The Raven Protocol Team

Retweet: https://twitter.com/raven_protocol/status/1668604775062011905
Reshare: https://www.linkedin.com/posts/ravenprotocol_concurrent-training-of-a-requester-graph-activity-7074373269776146432-AdX8
Like: https://medium.com/ravenprotocol/concurrent-training-of-a-requester-graph-across-multiple-participating-providers-80bb6a934bfb
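Item 5 mentions a reduced payload size for transferring results, without describing the mechanism. As a generic stdlib-only sketch of the underlying idea (serialize the result, then compress it before sending), not Ravenverse's actual implementation:

```python
# Generic illustration of shrinking a result payload before transfer.
# This is NOT Ravenverse's actual mechanism, only the underlying idea.
import pickle
import zlib

def pack_result(result: object) -> bytes:
    """Serialize and compress a computation result for transfer."""
    return zlib.compress(pickle.dumps(result), level=9)

def unpack_result(payload: bytes) -> object:
    """Reverse of pack_result: decompress, then deserialize."""
    return pickle.loads(zlib.decompress(payload))

result = {"gradients": [0.0] * 10_000}   # highly compressible toy result
payload = pack_result(result)
assert unpack_result(payload) == result
print(len(pickle.dumps(result)), "->", len(payload))
```

For numeric tensors, real systems typically get further savings from binary formats and dtype-aware compression; the round-trip property shown here is what matters for correctness.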
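Item 6 adds backup providers so graph progress survives a disconnect. The protocol details are not given; the following toy sketch (all names invented) only illustrates the failover idea of pairing each subgraph with a primary and a backup provider:

```python
# Toy sketch of the backup-provider idea: if the provider assigned to a
# subgraph disconnects, its work is handed to a backup so overall graph
# progress is not lost. All names here are invented for illustration.
def assign_with_backup(subgraphs, providers):
    """Map each subgraph id to a (primary, backup) provider pair,
    assigning primaries round-robin over the available providers."""
    n = len(providers)
    return {sg: (providers[i % n], providers[(i + 1) % n])
            for i, sg in enumerate(subgraphs)}

def compute(subgraph, pair, online):
    """Run on the primary if it is still online, otherwise fail over."""
    primary, backup = pair
    chosen = primary if primary in online else backup
    return f"{subgraph} computed by {chosen}"

assignments = assign_with_backup(["sg1", "sg2"], ["p1", "p2", "p3"])
online = {"p2", "p3"}                               # p1 has disconnected
print(compute("sg1", assignments["sg1"], online))   # sg1 computed by p2
```

A production scheduler would also re-replicate state and handle the backup itself failing; this sketch shows only the single-failover case.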