gzip

How to search for a particular string in a .gz file?

Submitted by 点点圈 on 2020-02-19 09:35:03

Question: I want to search for a particular string in a .gz file that contains a text file, without extracting it, in a Linux terminal. I know how to search for a string in a text file using grep "text to search" ./myfile.txt. But how do I make that work for .gz files?

Answer 1:

    gunzip -c mygzfile.gz | grep "string to be searched"

But this only works if the .gz file contains a text file, which is true in your case.

Answer 2: You can use zgrep. Usage is similar to grep:

    zgrep "pattern" file.gz

From the man page: zgrep invokes grep on compressed or gzipped files.
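If you would rather do the same search programmatically, here is a minimal Python sketch using the standard gzip module; the file name and search string are the placeholders from the answers above.

```python
import gzip

# Stream a gzipped text file line by line without extracting it to disk,
# printing matching lines with their line numbers.
with gzip.open("mygzfile.gz", "rt", encoding="utf-8", errors="replace") as f:
    for lineno, line in enumerate(f, 1):
        if "string to be searched" in line:
            print("%d: %s" % (lineno, line.rstrip()))
```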

Linux Compression and Decompression

Submitted by 萝らか妹 on 2020-02-13 23:43:30

Linux compression and decompression. The earliest tools were compress/uncompress, file suffix .Z. Next came gzip/gunzip, suffix .gz; for a long time gzip was the standard compression tool on Linux. Then bzip2/bunzip2, suffix .bz2: it compresses large files a little better than gzip, but small files worse, so it never displaced gzip. Then xz/unxz, suffix .xz: its compression ratio is markedly better, and it did shake gzip's position. There is also lzma/unlzma, suffix .lzma. The most universal pair (supported by every operating system) is zip/unzip.

Linux archiving: the compression tools above can only compress files, not directories. To compress a directory, you must first archive it and then compress the archive file. The archivers are tar and cpio.

gzip/gunzip/zcat

1. Compress: gzip file. After compression, the original file is deleted automatically:

    # ll -h messages
    -rw-------. 1 root root 915K Feb 11 22:05 messages
    [root@localhost ~]# gzip messages
    [root@localhost ~]# ll -h messages.gz
    -rw-------. 1 root root 167K Feb 11 22:05 messages.gz

2. Decompress: gzip -d or gunzip
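For comparison, the same compression step can be reproduced from Python with the standard gzip module. A minimal sketch, assuming a file named messages as in the transcript above; unlike the gzip command, it keeps the original file:

```python
import gzip
import shutil

# Compress "messages" to "messages.gz", roughly what `gzip messages` does.
# The gzip command would also delete the original; this sketch does not.
with open("messages", "rb") as src, gzip.open("messages.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```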

Linux Basic Commands --- gzip

Submitted by 浪子不回头ぞ on 2020-02-13 05:11:08

gzip compresses files with the Lempel-Ziv algorithm, preserving each file's owner, permissions, and modification time. Symbolic links are ignored.

If the compressed file name is too long for its file system, gzip truncates it. gzip tries to truncate only the parts of the file name longer than 3 characters (parts are delimited by dots); if the name consists only of small parts, the longest parts are truncated. For example, on a system with a 14-character file name limit, "gzip.msdos.exe" is compressed to "gzi.msd.exe.gz". On systems without a file name length limit, names are never truncated.

By default, gzip stores the original file name and timestamp in the compressed file. They are used when the file is decompressed with the -N option, which is useful when the compressed file name was truncated or the timestamp was not preserved after a file transfer. Compressed files can be restored to their original form with "gzip -d", "gunzip", or "zcat". If the original name stored in the compressed file does not suit its file system, a new legal name is constructed from the original.

gunzip takes a list of files on its command line and replaces each file whose name ends in .gz, -gz, .z, -z, _z, or .Z, and which begins with the correct magic number, with an uncompressed file without the original extension. gunzip also recognizes the special extensions .tgz and .taz as shorthand for .tar.gz and .tar.Z respectively. When compressing, gzip uses the .tgz extension when necessary instead of truncating a file name ending in .tar.

gunzip can currently decompress files created by gzip, zip, or compress.
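To see the stored file name the text describes, you can parse the gzip header directly. A minimal Python sketch, assuming the file was compressed with the name recorded (gzip's default); "messages.gz" is a placeholder path:

```python
import struct

def gzip_stored_name(path):
    """Return the original file name recorded in a gzip header, or None."""
    with open(path, "rb") as f:
        # Fixed 10-byte header: magic, method, flags, mtime, xfl, os.
        magic, method, flags, mtime = struct.unpack("<HBBI", f.read(10)[:8])
        if magic != 0x8B1F:
            raise ValueError("not a gzip file")
        if flags & 0x04:                      # FEXTRA: skip the extra field
            (xlen,) = struct.unpack("<H", f.read(2))
            f.read(xlen)
        if not flags & 0x08:                  # FNAME flag not set: no name stored
            return None
        name = bytearray()                    # zero-terminated name follows
        while True:
            b = f.read(1)
            if b in (b"", b"\x00"):
                break
            name += b
        return name.decode("latin-1")

print(gzip_stored_name("messages.gz"))
```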

Python POST and GET to Baidu (login)

Submitted by China☆狼群 on 2020-02-12 05:48:58

Python GET to fetch Baidu search results:

    # -*- coding: cp936 -*-
    import urllib2, urllib, sys, io
    """
    Query the Baidu search engine with GET.
    This example shows how to build a GET query string and make the request.
    """
    url = "http://www.baidu.com/s"
    search = [('w', 'codemo')]
    getString = url + "?" + urllib.urlencode(search)
    req = urllib2.Request(getString)
    fd = urllib2.urlopen(req)
    baiduResponse = ""
    while 1:
        data = fd.read(1024)
        if not len(data):
            break
        baiduResponse += data
    fobj = open("baidu.html", 'w')
    fobj.write(baiduResponse)
    fobj.close()

Python Baidu login (Python 2):

    import sys, urllib2, gzip, StringIO
    params = "charset=utf-8&codestring=&token=96f08093303c5c0b3f4a62acb8c04898&isPhone=false&index=0&u=http%3A%2F%2Fwww
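The code above is Python 2; urllib2 and StringIO no longer exist in Python 3. A rough Python 3 sketch of the same GET request, including the gzip handling that the login snippet's imports hint at; the URL and query values are taken from the example:

```python
import gzip
from urllib import parse, request

# Build the GET query string and request the Baidu results page.
url = "http://www.baidu.com/s"
query = parse.urlencode([("w", "codemo")])
req = request.Request(url + "?" + query, headers={"Accept-Encoding": "gzip"})

with request.urlopen(req) as resp:
    body = resp.read()
    # Inflate the body if the server answered with gzip encoding.
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)

with open("baidu.html", "wb") as fobj:
    fobj.write(body)
```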

Nginx Study (6): nginx gzip configuration

Submitted by 女生的网名这么多〃 on 2020-02-08 14:35:36

Common gzip configuration parameters:

    gzip on | off;                          # enable gzip or not
    gzip_buffers 32 4K | 16 8K;             # buffering: how many in-memory blocks, and how large each one is
    gzip_comp_level [1-9];                  # compression level; 6 is recommended (higher compresses smaller but burns more CPU)
    gzip_disable;                           # regex matched against the User-Agent; matching requests are not gzipped
    gzip_min_length 200;                    # minimum response length to compress (below this, compression is not worth it)
    gzip_http_version 1.0 | 1.1;            # lowest HTTP version to compress for (can be left unset; almost everything is 1.1 now)
    gzip_proxied;                           # whether and how to compress responses to requests that arrive via a proxy
    gzip_types text/plain application/xml;  # which MIME types to compress, e.g. txt, xml, html, css
    gzip_vary on | off;                     # whether to send the Vary: Accept-Encoding header

Note: binary files such as images and videos are not worth compressing; their compression ratio is small and compression costs CPU. Very small files are not worth compressing either, since the compressed result may even be larger.

Source: CSDN. Author: 码农-文若书生. Link: https://blog.csdn.net/u011943534/article/details/104220875
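As a quick illustration of the trade-off behind gzip_comp_level, the sketch below (plain Python, not nginx) compresses the same sample data at every level and prints the resulting sizes; the sample text is an arbitrary placeholder:

```python
import gzip

# Repetitive sample data, standing in for a typical HTML/CSS/JS response.
data = b"<div class='item'><span>placeholder</span></div>\n" * 2000

# Higher levels shrink the output a little more but cost more CPU time.
for level in range(1, 10):
    size = len(gzip.compress(data, compresslevel=level))
    print("level %d: %6d bytes (from %d)" % (level, size, len(data)))
```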

Do most shared hosts handle gzipped files?

Submitted by 我怕爱的太早我们不能终老 on 2020-02-06 05:26:31

Question: I get them theoretically, but I'm grappling with gzipping files in practice. How should I go about gzip-compressing my files, and what needs to be done in order to use them on a shared host? Would the following work?

    RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP:Accept-Encoding} .*gzip.*
    RewriteRule ^/(.*)\.js$ /$1.js.gz [L]
    RewriteRule ^/(.*)\.css$ /$1.css.gz [L]
    AddEncoding x-gzip text.gz

Answer 1: You're probably going to have trouble with the MIME type of the decompressed content still …
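To produce the name.js.gz / name.css.gz files those rewrite rules point at, you can pre-compress the assets yourself. A minimal Python sketch; the static/ directory is a placeholder:

```python
import gzip
import pathlib
import shutil

# Write a .js.gz / .css.gz sibling next to every JS and CSS file so the
# rewrite rules above can serve the pre-compressed copy.
for path in pathlib.Path("static").rglob("*"):
    if path.suffix in (".js", ".css"):
        gz_path = path.with_name(path.name + ".gz")
        with open(path, "rb") as src, gzip.open(gz_path, "wb", compresslevel=9) as dst:
            shutil.copyfileobj(src, dst)
```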

Implementing a web page processing script in Python

Submitted by 和自甴很熟 on 2020-02-06 01:26:15

An embedded web server differs from a traditional one: the pages have to be converted into array form and stored in flash before the lwip network interface can use them. Lately, business needs have me changing the pages frequently, and the compression and conversion are a tedious process every time, so I used the knowledge I have to write a Python script that batch-processes the web files, compresses them, and converts them into arrays.

Script environment (later versions compatible):

    Python 3.5.1 (see online tutorials for download, installation, and configuration)
    node.js v4.4.7, with the uglifyjs package installed, for non-text compression of js files
    uglifyjs, the engine used to compress JS files; for installation see http://www.zhangxinxu.com/wordpress/2013/01/uglifyjs-compress-js/

The implementation code is as follows:

    #!/usr/bin/python
    import os
    import binascii
    import shutil
    from functools import partial
    import re
    import gzip

    # Create a new folder
    def mkdir(path):
        path = path.strip()
        isExists = os.path.exists(path)
        # If the folder does not exist, create it
        if not isExists:
            os.makedirs(path)
            print(path + ' created successfully')
        else:
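The core conversion the article describes (gzip a page, then emit it as a C array for flash) can be sketched briefly. This is a hypothetical helper under those assumptions, not the article's actual code; the file and symbol names are placeholders:

```python
import gzip

def file_to_c_array(path, symbol):
    """Gzip a web file and render it as a C byte-array definition."""
    with open(path, "rb") as f:
        data = gzip.compress(f.read(), compresslevel=9)
    lines = []
    for i in range(0, len(data), 12):
        chunk = ", ".join("0x%02x" % b for b in data[i:i + 12])
        lines.append("    " + chunk + ",")
    return ("const unsigned char %s[%d] = {\n%s\n};\n"
            % (symbol, len(data), "\n".join(lines)))

print(file_to_c_array("index.html", "index_html_gz"))
```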

Single bundle with minification vs multiple files over http/2

Submitted by ♀尐吖头ヾ on 2020-02-03 10:04:41

Question: What is the general recommendation when it comes to CSS and JS bundling: is it better to bundle everything into one file, or is it better to serve multiple files? I personally say that multiple files are better, especially with HTTP/2, but there are good reasons for bundles: minification and gzip give better results when everything is in one file, because of all the recurrences you typically have when writing lots of code. Serving multiple files, on the other hand, improves caching and allows …
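The recurrence argument is easy to check empirically: gzip can exploit substrings shared across files only when they sit in one stream. A small Python sketch with made-up module contents:

```python
import gzip

# Two hypothetical JS "modules" with the kind of repetition real code has.
modules = [
    b"export function renderWidget(config) { return config.widget; }\n" * 50,
    b"export function renderDialog(config) { return config.dialog; }\n" * 50,
]

# Gzipping each file separately vs. gzipping one concatenated bundle.
separate = sum(len(gzip.compress(m)) for m in modules)
bundled = len(gzip.compress(b"".join(modules)))
print("separate: %d bytes, bundled: %d bytes" % (separate, bundled))
```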
