Based on 'Alternatively I could save the PDF's for later printing.', here is partial sample code to download a list of '.pdf' hypertext links.
-- AppleScript code begins here --
set pdf_URLs to {"http://apps.irs.gov/pub/irs-pdf/i1099h.pdf", "http://apps.irs.gov/pub/irs-pdf/i1099sa.pdf"}
-- POSIX path of the 'PDFs' folder (directory) to be created, into which the downloaded '.pdf' files will be saved.
set base_Path to ((POSIX path of (path to desktop folder from user domain) as string) & "PDFs/")
try -- Trap the 'directory already exists' (or equivalent) error if the folder (directory) already exists.
do shell script ("mkdir " & quoted form of base_Path) -- If it does not already exist, create the 'PDFs' folder on the current user's 'Desktop'.
end try
repeat with i in pdf_URLs
try -- Trap (and thus ignore) any unforeseen 'exception' errors.
set {oAStID, AppleScript's text item delimiters} to {AppleScript's text item delimiters, "/"} -- Save the current delimiters, then split on "/".
set file_Name to last text item of (i as string) -- The file name is the last path component of the URL.
set AppleScript's text item delimiters to oAStID -- Restore the original delimiters.
do shell script ("curl " & quoted form of (i as string) & " -o " & quoted form of (base_Path & file_Name)) -- Download, and properly label, the desired '.pdf' files.
end try
end repeat
-- AppleScript code ends here --
I also included (beneath the above code) ...
try
do shell script ("cd " & base_Path & "; lpr *.pdf") -- Print all downloaded '.pdf' files.
end try
... which worked (somewhat). Somehow, the 'HP Utility' (a web page at 'http://npi23061e.local./') treats each file's print job as a 'manual feed'; thus, I must click an 'Enter' button to actually print each '.pdf' file.
I may or may not investigate further at a future time.
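If the 'manual feed' behaviour comes from the print queue's default input-tray setting, the sketch below may help diagnose and work around it. It is only a hedged example, run beneath the above code so that 'base_Path' is still defined: 'My_HP_Printer' is a placeholder queue name (check the output of 'lpstat -p' for the real one), and 'InputSlot=Auto' is an assumed option / value that depends on the printer's PPD (see the output of 'lpoptions -l').
-- AppleScript code begins here --
-- Hedged sketch only: 'My_HP_Printer' is a placeholder queue name, and 'InputSlot=Auto' is an assumed PPD option / value.
set printer_Queues to (do shell script "lpstat -p") -- List the available print queues, to find the HP queue's name.
set tray_Options to (do shell script "lpoptions -p My_HP_Printer -l") -- List that queue's PPD options; look for an input-tray option such as 'InputSlot'.
try
do shell script ("cd " & quoted form of base_Path & "; lpr -P My_HP_Printer -o InputSlot=Auto *.pdf") -- Print again, requesting an automatic tray rather than manual feed.
end try
-- AppleScript code ends here --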