
Betty: An Automatic Differentiation Library for Multilevel Optimization


Author: Sang Keun Choe


Multilevel optimization has been widely adopted as a mathematical foundation for a myriad of machine learning problems, such as hyperparameter optimization, meta-learning, and reinforcement learning. Nonetheless, implementing multilevel optimization programs oftentimes requires expertise in both mathematics and programming, stunting research in this field. We take an initial step towards closing this gap by introducing Betty, a high-level software library for gradient-based multilevel optimization. To this end, we develop an automatic differentiation procedure based on a novel interpretation of multilevel optimization as a dataflow graph. We further abstract the main components of multilevel optimization as Python classes, to enable easy, modular, and maintainable programming. We empirically demonstrate that Betty can be used as a high-level programming interface for an array of multilevel optimization programs, while also observing up to an 11% increase in test accuracy, a 14% decrease in GPU memory usage, and a 20% decrease in wall time over existing implementations on multiple benchmarks. The code is available at http://github.com/leopard-ai/betty.
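The abstract's two key ideas, interpreting a multilevel program as a dataflow graph between problems and abstracting each level as a Python class, are easiest to see in code. Below is a minimal two-level sketch (an example-reweighting setup) written in the style of Betty's documented `ImplicitProblem`/`Engine` interface. Treat it as a hedged illustration rather than the library's canonical usage: the class and argument names follow the project's README pattern but may differ across Betty versions, and the toy dataset, model sizes, and sigmoid weighting scheme are assumptions made here for a self-contained example, not the paper's experiments.

```python
# A hedged sketch of a two-level program (example reweighting) in the style
# of Betty's documented API. Names follow the project README at the time of
# the paper and may differ in newer releases; the synthetic data and the
# sigmoid weighting scheme are illustrative assumptions.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

from betty.engine import Engine
from betty.problems import ImplicitProblem
from betty.configs import Config, EngineConfig

# Toy 2-class dataset with 10 features, reused by both levels.
x, y = torch.randn(256, 10), torch.randint(0, 2, (256,))
def make_loader():
    return DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

class Reweight(ImplicitProblem):
    """Upper level: learns a per-example importance scorer."""
    def training_step(self, batch):
        inputs, labels = batch
        # Problems reference each other by name ("classifier" below); the
        # upper-level loss is the classifier's unweighted loss on this batch.
        logits = self.classifier(inputs)
        return F.cross_entropy(logits, labels)

class Classifier(ImplicitProblem):
    """Lower level: trains the classifier under upper-level weights."""
    def training_step(self, batch):
        inputs, labels = batch
        logits = self.module(inputs)
        loss = F.cross_entropy(logits, labels, reduction="none")
        # Per-example weights come from the upper-level problem's module.
        weights = torch.sigmoid(self.reweight(inputs)).squeeze(-1)
        return (weights * loss).mean()

scorer = torch.nn.Linear(10, 1)
net = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(),
                          torch.nn.Linear(32, 2))

reweight = Reweight(
    name="reweight",
    module=scorer,
    optimizer=torch.optim.Adam(scorer.parameters(), lr=1e-3),
    train_data_loader=make_loader(),
    config=Config(type="darts"),        # choice of hypergradient method
)
classifier = Classifier(
    name="classifier",
    module=net,
    optimizer=torch.optim.SGD(net.parameters(), lr=1e-1),
    train_data_loader=make_loader(),
    config=Config(unroll_steps=1),
)

# The dependency dict encodes the paper's dataflow graph: "u2l" edges run
# upper-to-lower, "l2u" lower-to-upper.
engine = Engine(
    config=EngineConfig(train_iters=500),
    problems=[reweight, classifier],
    dependencies={"u2l": {reweight: [classifier]},
                  "l2u": {classifier: [reweight]}},
)
engine.run()
```

Note the division of labor this pattern implies: each `training_step` states only its own loss, while the hypergradient computation implied by the dependency graph is left to the engine's automatic differentiation, so (assuming the library exposes such options) switching the `Config` type swaps the hypergradient approximation without touching the problem code.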


