I'm trying to convert a hex string to a decimal string byte by byte, and then pass it to Struct.unpack to convert it to a double. However, I get the wrong number, even though the exact same string value written as a string literal gives the right number. Here is the code:
result:
I don't understand! What's going on?
asked 20 Mar '14, 11:40 by YXI, edited 20 Mar '14, 12:17 by Hadriel
One Answer:
Your for-loop is constructing a Lua string containing the characters exactly as printed. In other words, you've constructed a string of the character "\" followed by the character "6", followed by "4", and so on. But this:
Does not construct that same string. Why? Because those "\64"-style sequences inside a string literal are escape sequences: Lua converts each one into a single byte with that decimal value. In other words, before Struct.unpack() ever sees the string, the literal is already raw bytes. Try doing this and you'll see what I mean:
You'll probably see gibberish on your screen, because it's a string of bytes which may or may not be printable characters. But it's not the ascii character "\" followed by "6" and "4". This is just like in most any programming language I can think of - if you escape things inside a string literal in the source code/script, the compiler/interpreter knows you mean to handle them as something else.
answered 20 Mar '14, 12:31 by Hadriel
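To make the difference concrete, here is a minimal sketch in plain Lua (the byte values are invented for illustration, not taken from the question above):

local built = ""
for _, n in ipairs({ 64, 94, 221, 204 }) do
    built = built .. "\\" .. n         -- a real backslash character plus digit characters
end
print(#built)                          -- 14: the text \64\94\221\204, character by character

local literal = "\64\94\221\204"       -- escape sequences: Lua turns each one into a single byte
print(#literal)                        -- 4: four raw bytes
print(built == literal)                -- false

The first string is fourteen printable characters of text; the literal is four raw bytes, which is what Struct.unpack() actually expects.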
BTW, if you can tell me what it is you're trying to accomplish - the bigger picture - perhaps I can help you get there faster. But if you'd like to get there on your own that's cool too. :)
Well, the big picture is I need to convert bytes of different lengths to various numbers (unsigned int, int, float, double, etc).
When I have 4 bytes or less it's more or less straightforward. When I have more than 4 bytes, I've been using UInt64/Int64 to get the value from the bytes, packing it first and then unpacking to convert to the type I want. However, when converting 8 bytes to a double, it doesn't work right on Windows. Following your answer to my question, I decided to skip the pack step and go directly to unpack, passing it a string of the bytes' values in decimal. Obviously I thought it was a normal string of numbers separated by "\". Now I realize that's not the case. How did you get the decimal number sequence with escape characters from the hex? If you have an easier way to do what I need to do, please share.
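If skipping the pack step is the goal, one way to read it is to hand Struct.unpack() a raw byte string built with string.char(). A hedged sketch, assuming this runs inside Wireshark where the Struct library is available, with invented big-endian byte values:

-- eight invented bytes assembled as raw bytes (not text), then read as a big-endian double
local bytes = string.char(0x40, 0x5E, 0xDD, 0x2F, 0x1A, 0x9F, 0xBE, 0x77)
local value = Struct.unpack(">d", bytes)   -- ">" = big-endian, "d" = 8-byte double
print(value)

If the capture stores the value little-endian, "<d" would be the format instead.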
"How did you get the decimal number sequence with escape characters from hex?"
I used my calculator to convert the hex value to a decimal value. Mac's built-in calculator in programmer view mode is quite useful. :)
But obviously you wouldn't do that in a Lua script - you'd use string.byte() in a for-loop or as an argument to string.gsub(). But in Wireshark you could just use Struct.fromhex(). But that's assuming you started out with a string of hex-ascii characters. Is that the case for you? Where did this string come from? If it's from a packet's contents, why do you need to use Lua to convert them instead of using the provided tvb/tvbrange or tree functions?
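A sketch of both routes, assuming you really do start from a hex-ascii string (the value below is just an example):

local hexstr = "405EDD2F1A9FBE77"      -- example hex-ascii input

-- inside Wireshark, Struct.fromhex() turns hex-ascii into a raw byte string directly
local raw = Struct.fromhex(hexstr)
print(Struct.unpack(">d", raw))

-- the same conversion in plain Lua, with a gsub over pairs of hex digits
local raw2 = hexstr:gsub("%x%x", function(h)
    return string.char(tonumber(h, 16))
end)
print(raw == raw2)                     -- true

-- string.byte() then gives back the decimal value of each byte, if you need those
print(string.byte(raw, 1, #raw))       -- 64  94  221  47  26  159  190  119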
I cannot use the tvb/tvbrange functions because my values are stored in disjoint bytes. For example, an 8-byte value can be stored at offsets 0-4 and then 16-20. I could use byteArray:tvb() to create a new tvb from these bytes. However, whenever I create a new tvb this way, a new tab is added to the packet bytes pane in the Wireshark GUI. And since I have lots of these tabs, I believe that is why my Wireshark crashes whenever I click on a tree item that refers to a newly created tvb. So now what I do to get the value is this code. Let's say byteList has the byte locations I need from the payload:
byteList = { 4, 5, 6, 7, 12, 13, 14, 15 }
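The code itself appears to have been cut off above, so the following is only a hypothetical sketch of that approach; payload_tvb and bytes_at are made-up names, and it assumes the offsets in byteList are relative to a Tvb covering the payload:

local byteList = { 4, 5, 6, 7, 12, 13, 14, 15 }   -- the byte locations from above

-- collect one byte at each offset into a raw byte string, without creating a new Tvb
-- (so no extra tab shows up in the packet bytes pane)
local function bytes_at(tvb, offsets)
    local parts = {}
    for i, off in ipairs(offsets) do
        parts[i] = string.char(tvb:range(off, 1):uint())
    end
    return table.concat(parts)
end

local value = Struct.unpack(">d", bytes_at(payload_tvb, byteList))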