nasanhak wrote:This is what I had in mind with texture editing:
Don't know if they look good or not
Looks good to me. Though I don't do texture editing so I wouldn't be able to give you a detailed critique or anything.
nasanhak wrote:Also Mr. CantStoptheBipBop, would you consider using this for your Radio Silence mod?
I honestly forgot about that mod. It sounds like a nifty function but I prefer keeping the optional mission codecs and one-time calls (like the first time you find digitalis) in, as well as the ones that indicate when the mission is complete, certain intro calls, etc.
Tex wrote:Alternate drop code that also drops support items
Cool inf addition. I tried merging it and that weapon drop mod awhile back but could never get them working together properly. For some reason it would cause me to spawn without a collision box (I guess) and fall below the map.
So I've been working on a Lua script to automate the whole lang_dictionary process, and it's basically complete. There are still some small inefficiencies in the code, but nothing I'm too concerned about since Lua's end of things already runs in a heartbeat. Now it can run more like actual brute-force and dictionary attacks instead of simply generating large text files that have to be cut down and edited for LangTool to use (apparently it doesn't run with enough memory to load files of absurd size).
The gist: it loads a file containing the current entries plus the new ones it has generated into a table. Then, on each pass, it runs the dictionary attack function for one second; runs LangTool on everything (or just selected files); pulls lines containing "LangId" with cmd's findstr command; loads those into a table and isolates the entries with regex; calls a function to remove entries that duplicate the original table; overwrites the file loaded at the start with the new table's values; and loops. The table duplicate functions are from
here.
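To make the extraction step concrete: findstr hands back the raw XML lines, and the Lua pattern pulls just the id out of each one. A toy example (the XML line below is made up for illustration; actual LangTool output may be shaped differently). The pattern is essentially the one the script uses, minus the stray commas in the character class, which would otherwise also match literal commas:

```lua
-- hypothetical sample of one line findstr might return from LangTool's XML
local line = '<LangString LangId="announce_enemy_down">'
-- greedy prefix, then "LangId=", one char for the quote, then the id itself
local id = line:match(".*LangId=.([a-zA-Z0-9_]*).*")
print(id)  -- announce_enemy_down
```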
Code:
-- seed the RNG; the throwaway math.random() call discards the first value,
-- which is poorly distributed on some platforms
math.randomseed(os.time())
math.random()

-- load the current entry list; t is also read by runLangTool below
local file=io.open("newDict.txt")
local t={}
if file then  -- the file may not exist yet on a first run
  for line in file:lines() do
    t[#t+1]=line
  end
  file:close()
end

-- blank out the dictionary, then decode every .lng/.lng2 in the tree once
os.execute[["type nul>lang_dictionary.txt"]]
os.execute[["for /r %G in (*.lng?) do (LangTool.exe "%G")"]]

-- how many times argEntry appears in argTable
local function table_count(argTable,argEntry)
  local count=0
  for _,v in pairs(argTable) do
    if argEntry==v then
      count=count+1
    end
  end
  return count
end

-- merge fullTable into idTable, then return a copy with duplicates removed
local function table_unique(idTable,fullTable)
  local newTable={}
  for _,v in pairs(fullTable) do
    idTable[#idTable+1]=v
  end
  for _,v in ipairs(idTable) do
    if table_count(newTable,v)==0 then
      newTable[#newTable+1]=v
    end
  end
  return newTable
end

-- write random two-word "announce_a_b" candidates for one second
local function dictionaryAttack()
  local startOfScript=os.time()
  local fileIn=io.open("dictList.txt")
  local fileOut=io.open("lang_dictionary.txt", "w")
  local words={}
  for line in fileIn:lines() do
    words[#words+1]=line
  end
  fileIn:close()
  local d=#words
  while true do
    if os.time()>=startOfScript+1 then fileOut:flush(); fileOut:close(); break end
    local a,b --,c
    a=words[math.random(d)]
    b=words[math.random(d)]
    --c=words[math.random(d)]
    fileOut:write("announce".."_"..a.."_"..b, "\n")
  end
end

-- run LangTool on the target files, scrape decoded ids out of the XML,
-- and merge them with the existing entry list
local function runLangTool()
  local tC={}
  local m={}
  --os.execute[["for /r %G in (*.lng?) do (LangTool.exe "%G")"]]
  os.execute[["LangTool.exe tpp_announce_log.eng.lng2"]]
  os.execute[["LangTool.exe tpp_fob.eng.lng2"]]
  os.execute[["findstr /r "LangId" *.xml>tempIdComp.txt"]]
  local file=io.open("tempIdComp.txt")
  for line in file:lines() do
    tC[#tC+1]=line
  end
  file:close()
  for _,v in pairs(tC) do
    m[#m+1]=v:match(".*LangId=.([a-zA-Z0-9_]*).*")
  end
  -- (the old clearing loop here assigned nil one past the end of tC, a
  -- no-op; tC is local, so it can simply go out of scope)
  m=table_unique(m,t)
  return m
end

-- main loop: generate candidates, test them, fold hits back into newDict.txt
while true do
  dictionaryAttack()
  local m=runLangTool()
  local file=io.open("newDict.txt", "w")
  for _,v in pairs(m) do
    file:write(v,"\n")
  end
  file:flush(); file:close()
  -- reread the file so t matches exactly what's on disk for the next pass
  file=io.open("newDict.txt")
  t={}
  for line in file:lines() do
    t[#t+1]=line
  end
  file:close()
end
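On the inefficiency I mentioned: table_unique calls table_count once per entry, rescanning the whole table each time, so the dedupe is O(n²). It's fine at current sizes, but if the entry list ever gets big, a set-style lookup table does the same job in linear time. A sketch (the function name is mine, not part of the script):

```lua
-- drop-in alternative to table_count/table_unique: O(n) instead of O(n^2)
local function table_unique_fast(idTable, fullTable)
  local seen, out = {}, {}
  for _, v in pairs(fullTable) do
    idTable[#idTable+1] = v          -- mirror the original: merge fullTable in
  end
  for _, v in ipairs(idTable) do
    if not seen[v] then              -- first time we've seen this entry
      seen[v] = true
      out[#out+1] = v
    end
  end
  return out
end
```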
It seems to work flawlessly, but let me know if anyone has issues with it. I recommend sticking to a max of two-word generation with a common word of your choosing, since that seems to be the best method of getting clean results. Just spamming LangTool with the three+ word combos gives back functional "wrong" entries; they just look ugly.
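If you'd rather not rely on the one-second random loop for two-word generation, the same candidates can be enumerated exhaustively with a plain nested loop, which guarantees every pair gets tried exactly once. A sketch (the word list here is a made-up example; the script pulls its words from dictList.txt):

```lua
-- enumerate every two-word "announce_a_b" combo once, instead of sampling
local words = { "enemy", "down", "marked", "eliminated" }
local out = {}
for _, a in ipairs(words) do
  for _, b in ipairs(words) do
    out[#out+1] = "announce_" .. a .. "_" .. b
  end
end
print(#out)  -- 16 candidates for a 4-word list
```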