I could not download data with HAPI, so I modified some of its networking code to work under Python 3.10 #52

Open
MZA1997 opened this issue Jun 3, 2024 · 1 comment

Comments


MZA1997 commented Jun 3, 2024

import urllib.request
import urllib.error
import ssl
import json

def queryHITRAN(TableName, iso_id_list, numin, numax, pargroups=[], params=[], dotpar=True, head=False):
    # prepareParlist, prepareHeader, storage2cache and the VARIABLES dict are defined in hapi.py.
    ParameterList = prepareParlist(pargroups=pargroups, params=params, dotpar=dotpar)
    TableHeader = prepareHeader(ParameterList)
    TableHeader['table_name'] = TableName
    DataFileName = VARIABLES['BACKEND_DATABASE_NAME'] + '/' + TableName + '.data'
    HeaderFileName = VARIABLES['BACKEND_DATABASE_NAME'] + '/' + TableName + '.header'
    
    iso_id_list_str = [str(iso_id) for iso_id in iso_id_list]
    iso_id_list_str = ','.join(iso_id_list_str)
    print('\nData is fetched from %s\n' % VARIABLES['GLOBAL_HOST'])
    
    if pargroups or params:  # custom par search
        url = VARIABLES['GLOBAL_HOST'] + '/lbl/api?' + \
            'iso_ids_list=' + iso_id_list_str + '&' + \
            'numin=' + str(numin) + '&' + \
            'numax=' + str(numax) + '&' + \
            'head=' + str(head) + '&' + \
            'fixwidth=0&sep=[comma]&' + \
            'request_params=' + ','.join(ParameterList)
    else:  # old-fashioned .par search
        url = VARIABLES['GLOBAL_HOST'] + '/lbl/api?' + \
            'iso_ids_list=' + iso_id_list_str + '&' + \
            'numin=' + str(numin) + '&' + \
            'numax=' + str(numax)
    
    if VARIABLES['DISPLAY_FETCH_URL']:
        print(url + '\n')
    
    try:
        # Use an unverified SSL context so the request is not blocked by
        # certificate-verification errors (note: this skips certificate checks).
        context = ssl._create_unverified_context()
        
        if VARIABLES['PROXY']:
            print('Using proxy ' + str(VARIABLES['PROXY']))
            proxy = urllib.request.ProxyHandler(VARIABLES['PROXY'])
            opener = urllib.request.build_opener(proxy)
            urllib.request.install_opener(opener)
        
        req = urllib.request.urlopen(url, context=context)
    except urllib.error.HTTPError:
        raise Exception('Failed to retrieve data for given parameters.')
    except urllib.error.URLError:
        raise Exception('Cannot connect to %s. Try again or edit GLOBAL_HOST variable.' % VARIABLES['GLOBAL_HOST'])
    
    CHUNK = 64 * 1024
    print('BEGIN DOWNLOAD: ' + TableName)
    
    with open(DataFileName, 'w') as fp:
        while True:
            chunk = req.read(CHUNK)
            if not chunk:
                break
            fp.write(chunk.decode('utf-8'))
            print('  %d bytes written to %s' % (len(chunk), DataFileName))
    
    with open(HeaderFileName, 'w') as fp:
        fp.write(json.dumps(TableHeader, indent=2))
        print('Header written to %s' % HeaderFileName)
    
    print('END DOWNLOAD')
    
    storage2cache(TableName)
    print('PROCESSED')
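
A minimal usage sketch, assuming the patched queryHITRAN above replaces the original definition in a local copy of hapi.py. The table name, molecule and isotopologue numbers, and wavenumber range are illustrative values only:

from hapi import db_begin, fetch

# db_begin sets VARIABLES['BACKEND_DATABASE_NAME'], i.e. the folder that
# receives the .data and .header files written by queryHITRAN.
db_begin('data')

# Standard HAPI call; fetch() invokes queryHITRAN internally.
fetch('H2O', 1, 1, 3400, 4100)

Since the patch relies on ssl._create_unverified_context(), a private helper that skips certificate verification, it is a workaround for SSL certificate errors rather than a hardened fix; the usual caveats about unverified HTTPS connections apply.
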
@emccaughey

I was also having a problem getting data downloads to work using hapi. Thank you for the fix!
