A deduplication program written in VB that removes duplicate lines from a TXT file; source code is included.
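The core idea — reading a text file and dropping repeated lines — can be sketched briefly. The original program is VB and its source is not shown here, so this is an illustrative Python version of the same order-preserving technique; the function name and file names are placeholders, not the original program's identifiers.

```python
def dedupe_lines(lines):
    """Return the input lines with duplicates removed,
    keeping the first occurrence of each line in order."""
    seen = set()
    result = []
    for line in lines:
        if line not in seen:
            seen.add(line)
            result.append(line)
    return result

# Example usage against a file (paths are placeholders):
# with open("input.txt", encoding="utf-8") as f:
#     unique = dedupe_lines(f.readlines())
# with open("output.txt", "w", encoding="utf-8") as f:
#     f.writelines(unique)
```

Using a set for membership tests keeps the pass over the file linear, while the separate result list preserves the original line order, which a plain `set()` alone would not.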