Tree structured vector quantization based technique for speech compression

P. Kanawade, S. Gundal
{"title":"Tree structured vector quantization based technique for speech compression","authors":"P. Kanawade, S. Gundal","doi":"10.1109/ICDMAI.2017.8073524","DOIUrl":null,"url":null,"abstract":"In this paper, the Tree-Structured Vector Quantization (TSVQ) method for proficient speech compression is presented. Efficient utilization of memory is always needed when analog-encoded or digitized data such as image, audio, videos, portable files are need to store and/or convey to digital channels. Compression offers betterments with storage requirements while transmitting the encoded signals with lossy and lossless techniques. Lossy compression is always intended for compression of high volume data with Scalar Quantization (SQ) and Vector Quantization (VQ). The Tree based VQ method is used with hieratically organized binary sequences of codeword of data (speech) for compression with reduced and minimized arithmetic calculation requirements. Speech compression has been gained by compressed-codebook coefficients and structured in binary fashion. The quantization noise ratio with signal power is obtained efficiently around less than 1.082 dB. Shared codebook method described in this TSVQ algorithm achieves 3.6 reduced storage requirements of factor 5 to 3.","PeriodicalId":368507,"journal":{"name":"2017 International Conference on Data Management, Analytics and Innovation (ICDMAI)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 International Conference on Data Management, Analytics and Innovation (ICDMAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDMAI.2017.8073524","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

In this paper, a Tree-Structured Vector Quantization (TSVQ) method for efficient speech compression is presented. Efficient use of memory is needed whenever analog-encoded or digitized data such as images, audio, video, and portable files must be stored or conveyed over digital channels. Compression reduces storage requirements while the encoded signals are transmitted using lossy or lossless techniques. Lossy compression is typically applied to high-volume data using Scalar Quantization (SQ) or Vector Quantization (VQ). The tree-based VQ method organizes the codewords of the (speech) data as hierarchically arranged binary sequences, so compression requires fewer arithmetic operations. Speech compression is achieved with compressed codebook coefficients structured in binary fashion. The ratio of quantization noise to signal power is held below approximately 1.082 dB. The shared-codebook method described in this TSVQ algorithm reduces storage requirements by a factor of roughly 3 to 5 (approximately 3.6).
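Since the paper itself provides no code, the sketch below illustrates, under stated assumptions, the two TSVQ operations the abstract describes: growing a binary codebook tree by recursive 2-means (LBG-style) splitting of training frames, and encoding a vector by emitting the binary path to its nearest leaf. All names (`Node`, `build_tree`, `tsvq_encode`) and parameters (tree depth, frame length) are illustrative, not taken from the paper.

```python
import numpy as np

class Node:
    """One node of the binary TSVQ tree; internal nodes route, leaves reconstruct."""
    def __init__(self, centroid, left=None, right=None, index=None):
        self.centroid = centroid          # representative vector for this subtree
        self.left, self.right = left, right
        self.index = index                # leaf-only: position in the flat codebook

def build_tree(vectors, depth, index_base=0):
    """Grow a binary codebook tree by recursive 2-means (LBG-style) splitting."""
    centroid = vectors.mean(axis=0)
    if depth == 0 or len(vectors) < 2:
        return Node(centroid, index=index_base), index_base + 1
    a, b = centroid - 1e-3, centroid + 1e-3   # perturb to seed the two children
    for _ in range(10):                       # a few 2-means refinement passes
        mask = (np.linalg.norm(vectors - a, axis=1)
                <= np.linalg.norm(vectors - b, axis=1))
        if mask.all() or not mask.any():      # degenerate split: stop at a leaf
            return Node(centroid, index=index_base), index_base + 1
        a = vectors[mask].mean(axis=0)
        b = vectors[~mask].mean(axis=0)
    left, nxt = build_tree(vectors[mask], depth - 1, index_base)
    right, nxt = build_tree(vectors[~mask], depth - 1, nxt)
    return Node(centroid, left, right), nxt

def tsvq_encode(node, x):
    """Walk the tree: one two-way distance comparison per level, O(depth) total."""
    bits = []
    while node.left is not None:
        go_right = (np.linalg.norm(x - node.right.centroid)
                    < np.linalg.norm(x - node.left.centroid))
        bits.append(int(go_right))
        node = node.right if go_right else node.left
    return bits, node.centroid    # bit path is transmitted; centroid reconstructs x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.standard_normal((5000, 8))         # stand-in for 8-sample speech frames
    root, n_codewords = build_tree(frames, depth=6) # up to 2**6 = 64 codewords
    recons = np.array([tsvq_encode(root, f)[1] for f in frames[:200]])
    err = frames[:200] - recons
    # Signal-to-quantization-noise ratio, the figure of merit the abstract cites.
    sqnr_db = 10 * np.log10((frames[:200] ** 2).sum() / (err ** 2).sum())
    print(f"codewords: {n_codewords}, SQNR: {sqnr_db:.2f} dB")
```

With tree depth d, encoding costs 2d distance computations instead of the 2^d required by an exhaustive search over a flat codebook of the same size; this is the reduction in arithmetic the abstract claims. The shared-codebook storage scheme and the paper's specific 1.082 dB figure are not reproduced here.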